Recently we needed to integrate QR code scanning into our company's app for outdoor use: reading QR codes printed on A4 paper and pasted onto shipping containers at the docks. Most apps treat QR integration as finished once codes scan at all, but after we shipped, users reported that outdoors many codes could not be recognized, or scanned very slowly; we tried it ourselves and confirmed the problem. We traced the recognition failures to a few causes:

- Labels printed on A4 paper are not very sharp to begin with; some modules of the code may not print at all.
- The outdoor environment is harsh and changeable, and dust and grit settle on the labels.
- Sun, rain, and scratches leave the codes damaged or stained.
- Android hardware varies widely: some phones have good high-resolution cameras, others do not.
That is roughly the situation. Yet WeChat, whose scanner is based on QBar (an optimization built on ZXing), recognizes codes in every one of the conditions above almost instantly; Alipay and DingTalk, built on the libqrencode library, can read them too; and so can iOS with its system scanner. Why not our Android app? The boss didn't care about the details: "theirs works and yours doesn't, so it's your problem." We tried many third-party scanning libraries, including ones with thousands of stars, and none of them met the requirements above; for a while we genuinely didn't know what to do. The only one that came close was https://github.com/vondear/RxTool, but it still failed on partially damaged codes, and the heavily starred "maple leaf" library couldn't cope with this environment at all. Guo Lin has recommended https://github.com/al4fun/SimpleScanner (his write-up: https://mp.weixin.qq.com/s/aPqSK1FlsPiENzSE48BVUA), but it performed noticeably worse than RxTool. In the end we had to modify the scanner ourselves. Our current version recognizes everything except badly damaged codes; it is sometimes a bit slow and needs an extra focus pass, so it is only marginally better than the libraries above. The code follows. (Many of these classes are modified versions of the original ZXing classes: the class names are unchanged, but some methods differ!)

build.gradle:

```gradle
dependencies {
    api fileTree(include: ['*.jar'], dir: 'libs')
    api files('libs/core-3.3.0.jar')
    // provided 'com.android.support:appcompat-v7:26.1.0'
    compileOnly 'com.android.support:design:26.1.0'
    compileOnly 'com.android.support:support-vector-drawable:26.1.0'
}
```
The file layout is as follows:

ZxingConfig.java
```java
import java.io.Serializable;

public class ZxingConfig implements Serializable {

  private boolean isPlayBeep = true;
  private boolean isShake = false;
  private boolean isShowbottomLayout = true;
  private boolean isShowFlashLight = true;
  private boolean isShowAlbum = true;

  public boolean isPlayBeep() {
    return isPlayBeep;
  }

  public void setPlayBeep(boolean playBeep) {
    isPlayBeep = playBeep;
  }

  public boolean isShake() {
    return isShake;
  }

  public void setShake(boolean shake) {
    isShake = shake;
  }

  public boolean isShowbottomLayout() {
    return isShowbottomLayout;
  }

  public void setShowbottomLayout(boolean showbottomLayout) {
    isShowbottomLayout = showbottomLayout;
  }

  public boolean isShowFlashLight() {
    return isShowFlashLight;
  }

  public void setShowFlashLight(boolean showFlashLight) {
    isShowFlashLight = showFlashLight;
  }

  public boolean isShowAlbum() {
    return isShowAlbum;
  }

  public void setShowAlbum(boolean showAlbum) {
    isShowAlbum = showAlbum;
  }
}
```
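ZxingConfig implements Serializable so the whole configuration can be passed to the capture activity through a single Intent extra. A minimal off-device sketch of the serialization round-trip, using a trimmed stand-in for the class (the real Intent machinery does effectively the same thing):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class ConfigRoundTrip {

    // Trimmed stand-in for ZxingConfig: a plain Serializable bean.
    public static class Config implements java.io.Serializable {
        public boolean playBeep = true;
        public boolean showFlashLight = true;
    }

    // Serialize and deserialize, as putExtra/getSerializableExtra do under the hood.
    public static Config roundTrip(Config in) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(in);
        }
        try (ObjectInputStream ois =
                new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            return (Config) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Config c = new Config();
        c.playBeep = false;                 // user disabled the beep
        Config copy = roundTrip(c);
        System.out.println(copy.playBeep + " " + copy.showFlashLight);
    }
}
```

The point is simply that every flag survives the copy, so the scan activity sees exactly the configuration the caller built.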
CameraFacing.java
```java
public enum CameraFacing {

  BACK,  // must be value 0!
  FRONT, // must be value 1!

}
```
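The "must be value 0/1" comments matter because other classes map Android's `Camera.CameraInfo.facing` int directly onto the enum with `CameraFacing.values()[facing]`. A small self-contained check of that assumption (the framework defines `CAMERA_FACING_BACK = 0` and `CAMERA_FACING_FRONT = 1`):

```java
public class FacingMapping {

    // Declaration order must match the framework's int constants.
    public enum CameraFacing { BACK, FRONT }

    // Android defines CAMERA_FACING_BACK = 0, CAMERA_FACING_FRONT = 1.
    public static CameraFacing fromFrameworkInt(int facing) {
        return CameraFacing.values()[facing];
    }

    public static void main(String[] args) {
        System.out.println(fromFrameworkInt(0)); // BACK
        System.out.println(fromFrameworkInt(1)); // FRONT
    }
}
```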
OpenCamera.java
```java
import android.hardware.Camera;

public final class OpenCamera {

  private final int index;
  private final Camera camera;
  private final CameraFacing facing;
  private final int orientation;

  public OpenCamera(int index, Camera camera, CameraFacing facing, int orientation) {
    this.index = index;
    this.camera = camera;
    this.facing = facing;
    this.orientation = orientation;
  }

  public Camera getCamera() {
    return camera;
  }

  public CameraFacing getFacing() {
    return facing;
  }

  public int getOrientation() {
    return orientation;
  }

  @Override
  public String toString() {
    return "Camera #" + index + " : " + facing + ',' + orientation;
  }
}
```
OpenCameraInterface.java
```java
import android.hardware.Camera;
import android.util.Log;

public final class OpenCameraInterface {

  private static final String TAG = OpenCameraInterface.class.getName();

  private OpenCameraInterface() {
  }

  /** For {@link #open(int)}, means no preference for which camera to open. */
  public static final int NO_REQUESTED_CAMERA = -1;

  /**
   * Opens the requested camera with {@link Camera#open(int)}, if one exists.
   *
   * @param cameraId camera ID of the camera to use. A negative value
   *  or {@link #NO_REQUESTED_CAMERA} means "no preference", in which case a rear-facing
   *  camera is returned if possible or else any camera
   * @return handle to {@link OpenCamera} that was opened
   */
  public static OpenCamera open(int cameraId) {
    int numCameras = Camera.getNumberOfCameras();
    if (numCameras == 0) {
      Log.w(TAG, "No cameras!");
      return null;
    }

    boolean explicitRequest = cameraId >= 0;

    Camera.CameraInfo selectedCameraInfo = null;
    int index;
    if (explicitRequest) {
      index = cameraId;
      selectedCameraInfo = new Camera.CameraInfo();
      Camera.getCameraInfo(index, selectedCameraInfo);
    } else {
      index = 0;
      while (index < numCameras) {
        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        Camera.getCameraInfo(index, cameraInfo);
        CameraFacing reportedFacing = CameraFacing.values()[cameraInfo.facing];
        if (reportedFacing == CameraFacing.BACK) {
          selectedCameraInfo = cameraInfo;
          break;
        }
        index++;
      }
    }

    Camera camera;
    if (index < numCameras) {
      Log.i(TAG, "Opening camera #" + index);
      camera = Camera.open(index);
    } else {
      if (explicitRequest) {
        Log.w(TAG, "Requested camera does not exist: " + cameraId);
        camera = null;
      } else {
        Log.i(TAG, "No camera facing " + CameraFacing.BACK + "; returning camera #0");
        camera = Camera.open(0);
        selectedCameraInfo = new Camera.CameraInfo();
        Camera.getCameraInfo(0, selectedCameraInfo);
      }
    }

    if (camera == null) {
      return null;
    }
    return new OpenCamera(index, camera,
        CameraFacing.values()[selectedCameraInfo.facing],
        selectedCameraInfo.orientation);
  }
}
```
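The selection policy in open() can be summarized without the Android API: honor an explicit camera index if one was requested (failing if it doesn't exist), otherwise take the first back-facing camera, otherwise fall back to camera #0. A sketch over a plain int[] of facing values (0 = back, 1 = front; the helper names here are mine, not from the library):

```java
public class CameraSelection {

    public static final int NO_REQUESTED_CAMERA = -1;

    // facings[i] is the facing of camera i: 0 = back, 1 = front.
    // Returns the index to open, or -1 if nothing can be opened.
    public static int selectIndex(int[] facings, int requested) {
        if (requested >= 0) {
            // An explicit request must name a real camera.
            return requested < facings.length ? requested : -1;
        }
        for (int i = 0; i < facings.length; i++) {
            if (facings[i] == 0) {
                return i; // first back-facing camera wins
            }
        }
        // No back camera at all: fall back to camera #0 if there is one.
        return facings.length > 0 ? 0 : -1;
    }

    public static void main(String[] args) {
        // Front camera at 0, back camera at 1: no preference picks the back one.
        System.out.println(selectIndex(new int[] {1, 0}, NO_REQUESTED_CAMERA)); // 1
    }
}
```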
AutoFocusManager.java
```java
import android.content.Context;
import android.content.SharedPreferences;
import android.hardware.Camera;
import android.os.AsyncTask;
import android.preference.PreferenceManager;
import android.util.Log;

import java.util.ArrayList;
import java.util.Collection;
import java.util.concurrent.RejectedExecutionException;

final class AutoFocusManager implements Camera.AutoFocusCallback {

  private static final String TAG = AutoFocusManager.class.getSimpleName();

  private static final long AUTO_FOCUS_INTERVAL_MS = 2000L;
  private static final Collection<String> FOCUS_MODES_CALLING_AF;
  static {
    FOCUS_MODES_CALLING_AF = new ArrayList<>(2);
    FOCUS_MODES_CALLING_AF.add(Camera.Parameters.FOCUS_MODE_AUTO);
    FOCUS_MODES_CALLING_AF.add(Camera.Parameters.FOCUS_MODE_MACRO);
  }

  private boolean stopped;
  private boolean focusing;
  private final boolean useAutoFocus;
  private final Camera camera;
  private AsyncTask<?,?,?> outstandingTask;

  AutoFocusManager(Context context, Camera camera) {
    this.camera = camera;
    SharedPreferences sharedPrefs = PreferenceManager.getDefaultSharedPreferences(context);
    String currentFocusMode = camera.getParameters().getFocusMode();
    useAutoFocus =
        sharedPrefs.getBoolean(QRConstants.KEY_AUTO_FOCUS, true) &&
        FOCUS_MODES_CALLING_AF.contains(currentFocusMode);
    Log.i(TAG, "Current focus mode '" + currentFocusMode + "'; use auto focus? " + useAutoFocus);
    start();
  }

  @Override
  public synchronized void onAutoFocus(boolean success, Camera theCamera) {
    focusing = false;
    autoFocusAgainLater();
  }

  private synchronized void autoFocusAgainLater() {
    if (!stopped && outstandingTask == null) {
      AutoFocusTask newTask = new AutoFocusTask();
      try {
        newTask.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
        outstandingTask = newTask;
      } catch (RejectedExecutionException ree) {
        Log.w(TAG, "Could not request auto focus", ree);
      }
    }
  }

  synchronized void start() {
    if (useAutoFocus) {
      outstandingTask = null;
      if (!stopped && !focusing) {
        try {
          camera.autoFocus(this);
          focusing = true;
        } catch (RuntimeException re) {
          // Have heard RuntimeException reported in Android 4.0.x+; continue?
          Log.w(TAG, "Unexpected exception while focusing", re);
          // Try again later to keep cycle going
          autoFocusAgainLater();
        }
      }
    }
  }

  private synchronized void cancelOutstandingTask() {
    if (outstandingTask != null) {
      if (outstandingTask.getStatus() != AsyncTask.Status.FINISHED) {
        outstandingTask.cancel(true);
      }
      outstandingTask = null;
    }
  }

  synchronized void stop() {
    stopped = true;
    if (useAutoFocus) {
      cancelOutstandingTask();
      // Doesn't hurt to call this even if not focusing
      try {
        camera.cancelAutoFocus();
      } catch (RuntimeException re) {
        // Have heard RuntimeException reported in Android 4.0.x+; continue?
        Log.w(TAG, "Unexpected exception while cancelling focusing", re);
      }
    }
  }

  private final class AutoFocusTask extends AsyncTask<Object,Object,Object> {
    @Override
    protected Object doInBackground(Object... voids) {
      try {
        Thread.sleep(AUTO_FOCUS_INTERVAL_MS);
      } catch (InterruptedException e) {
        // continue
      }
      start();
      return null;
    }
  }
}
```
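AutoFocusManager re-arms itself every AUTO_FOCUS_INTERVAL_MS through an AsyncTask that sleeps and calls start() again, which is what gives the scanner its repeated focus passes. The same cycle can be sketched with a ScheduledExecutorService (AsyncTask is deprecated on modern Android); here `focusOnce` is a stand-in for `camera.autoFocus(callback)`:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch of AutoFocusManager's re-arm cycle, not the library's actual implementation.
public final class FocusCycle {

    private final ScheduledExecutorService scheduler =
        Executors.newSingleThreadScheduledExecutor();
    private final long intervalMs;     // mirrors AUTO_FOCUS_INTERVAL_MS
    private final Runnable focusOnce;  // stands in for camera.autoFocus(callback)
    private volatile boolean stopped;

    public FocusCycle(long intervalMs, Runnable focusOnce) {
        this.intervalMs = intervalMs;
        this.focusOnce = focusOnce;
    }

    /** Trigger one focus pass, then schedule the next, like AutoFocusTask does. */
    public void start() {
        if (stopped) {
            return;
        }
        focusOnce.run();
        try {
            scheduler.schedule(this::start, intervalMs, TimeUnit.MILLISECONDS);
        } catch (RejectedExecutionException ree) {
            // stop() already shut the scheduler down; end the cycle quietly
        }
    }

    public void stop() {
        stopped = true;
        scheduler.shutdownNow();
    }
}
```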
CameraConfigurationManager.java
```java
import android.content.Context;
import android.content.SharedPreferences;
import android.graphics.Point;
import android.hardware.Camera;
import android.preference.PreferenceManager;
import android.util.Log;
import android.view.Display;
import android.view.Surface;
import android.view.WindowManager;

final class CameraConfigurationManager {

  private static final String TAG = "CameraConfiguration";

  private final Context context;
  private int cwNeededRotation;
  private int cwRotationFromDisplayToCamera;
  private Point screenResolution;
  private Point cameraResolution;
  private Point bestPreviewSize;
  private Point previewSizeOnScreen;

  CameraConfigurationManager(Context context) {
    this.context = context;
  }

  /** Reads, one time, values from the camera that are needed by the app. */
  void initFromCameraParameters(OpenCamera camera) {
    Camera.Parameters parameters = camera.getCamera().getParameters();
    WindowManager manager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
    Display display = manager.getDefaultDisplay();

    int displayRotation = display.getRotation();
    int cwRotationFromNaturalToDisplay;
    switch (displayRotation) {
      case Surface.ROTATION_0:
        cwRotationFromNaturalToDisplay = 0;
        break;
      case Surface.ROTATION_90:
        cwRotationFromNaturalToDisplay = 90;
        break;
      case Surface.ROTATION_180:
        cwRotationFromNaturalToDisplay = 180;
        break;
      case Surface.ROTATION_270:
        cwRotationFromNaturalToDisplay = 270;
        break;
      default:
        // Have seen this return incorrect values like -90
        if (displayRotation % 90 == 0) {
          cwRotationFromNaturalToDisplay = (360 + displayRotation) % 360;
        } else {
          throw new IllegalArgumentException("Bad rotation: " + displayRotation);
        }
    }
    Log.i(TAG, "Display at: " + cwRotationFromNaturalToDisplay);

    int cwRotationFromNaturalToCamera = camera.getOrientation();
    Log.i(TAG, "Camera at: " + cwRotationFromNaturalToCamera);

    // Still not 100% sure about this. But acts like we need to flip this:
    if (camera.getFacing() == CameraFacing.FRONT) {
      cwRotationFromNaturalToCamera = (360 - cwRotationFromNaturalToCamera) % 360;
      Log.i(TAG, "Front camera overriden to: " + cwRotationFromNaturalToCamera);
    }

    SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
    String overrideRotationString;
    if (camera.getFacing() == CameraFacing.FRONT) {
      overrideRotationString = prefs.getString(PreferencesActivity.KEY_FORCE_CAMERA_ORIENTATION_FRONT, null);
    } else {
      overrideRotationString = prefs.getString(PreferencesActivity.KEY_FORCE_CAMERA_ORIENTATION, null);
    }
    if (overrideRotationString != null && !"-".equals(overrideRotationString)) {
      Log.i(TAG, "Overriding camera manually to " + overrideRotationString);
      cwRotationFromNaturalToCamera = Integer.parseInt(overrideRotationString);
    }

    cwRotationFromDisplayToCamera =
        (360 + cwRotationFromNaturalToCamera - cwRotationFromNaturalToDisplay) % 360;
    Log.i(TAG, "Final display orientation: " + cwRotationFromDisplayToCamera);
    if (camera.getFacing() == CameraFacing.FRONT) {
      Log.i(TAG, "Compensating rotation for front camera");
      cwNeededRotation = (360 - cwRotationFromDisplayToCamera) % 360;
    } else {
      cwNeededRotation = cwRotationFromDisplayToCamera;
    }
    Log.i(TAG, "Clockwise rotation from display to camera: " + cwNeededRotation);

    Point theScreenResolution = new Point();
    display.getSize(theScreenResolution);
    screenResolution = theScreenResolution;
    Log.i(TAG, "Screen resolution in current orientation: " + screenResolution);
    cameraResolution = CameraConfigurationUtils.findBestPreviewSizeValue(parameters, screenResolution);
    Log.i(TAG, "Camera resolution: " + cameraResolution);
    bestPreviewSize = CameraConfigurationUtils.findBestPreviewSizeValue(parameters, screenResolution);
    Log.i(TAG, "Best available preview size: " + bestPreviewSize);

    boolean isScreenPortrait = screenResolution.x < screenResolution.y;
    boolean isPreviewSizePortrait = bestPreviewSize.x < bestPreviewSize.y;
    if (isScreenPortrait == isPreviewSizePortrait) {
      previewSizeOnScreen = bestPreviewSize;
    } else {
      previewSizeOnScreen = new Point(bestPreviewSize.y, bestPreviewSize.x);
    }
    Log.i(TAG, "Preview size on screen: " + previewSizeOnScreen);
  }

  void setDesiredCameraParameters(OpenCamera camera, boolean safeMode) {
    Camera theCamera = camera.getCamera();
    Camera.Parameters parameters = theCamera.getParameters();
    if (parameters == null) {
      Log.w(TAG, "Device error: no camera parameters are available. Proceeding without configuration.");
      return;
    }
    Log.i(TAG, "Initial camera parameters: " + parameters.flatten());
    if (safeMode) {
      Log.w(TAG, "In camera config safe mode -- most settings will not be honored");
    }

    // SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
    initializeTorch(parameters, safeMode, QRConstants.disableExposure);
    CameraConfigurationUtils.setFocus(
        parameters,
        false, // disableContinuous: the argument originally passed here was lost in the dump
        safeMode);
    if (!safeMode) {
      //CameraConfigurationUtils.setInvertColor(parameters);
      CameraConfigurationUtils.setBarcodeSceneMode(parameters);
      CameraConfigurationUtils.setVideoStabilization(parameters);
      CameraConfigurationUtils.setFocusArea(parameters);
      CameraConfigurationUtils.setMetering(parameters);
    }
    parameters.setPreviewSize(bestPreviewSize.x, bestPreviewSize.y);
    theCamera.setParameters(parameters);
    theCamera.setDisplayOrientation(cwRotationFromDisplayToCamera);

    Camera.Parameters afterParameters = theCamera.getParameters();
    Camera.Size afterSize = afterParameters.getPreviewSize();
    if (afterSize != null && (bestPreviewSize.x != afterSize.width || bestPreviewSize.y != afterSize.height)) {
      Log.w(TAG, "Camera said it supported preview size " + bestPreviewSize.x + 'x' + bestPreviewSize.y +
          ", but after setting it, preview size is " + afterSize.width + 'x' + afterSize.height);
      bestPreviewSize.x = afterSize.width;
      bestPreviewSize.y = afterSize.height;
    }
  }

  Point getBestPreviewSize() {
    return bestPreviewSize;
  }

  Point getPreviewSizeOnScreen() {
    return previewSizeOnScreen;
  }

  Point getCameraResolution() {
    return cameraResolution;
  }

  Point getScreenResolution() {
    return screenResolution;
  }

  int getCWNeededRotation() {
    return cwNeededRotation;
  }

  boolean getTorchState(Camera camera) {
    if (camera != null) {
      Camera.Parameters parameters = camera.getParameters();
      if (parameters != null) {
        String flashMode = parameters.getFlashMode();
        return flashMode != null &&
            (Camera.Parameters.FLASH_MODE_ON.equals(flashMode) ||
             Camera.Parameters.FLASH_MODE_TORCH.equals(flashMode));
      }
    }
    return false;
  }

  void setTorch(Camera camera, boolean newSetting) {
    Camera.Parameters parameters = camera.getParameters();
    doSetTorch(parameters, newSetting, false, QRConstants.disableExposure);
    camera.setParameters(parameters);
  }

  private void initializeTorch(Camera.Parameters parameters, boolean safeMode, boolean disableExposure) {
    boolean currentSetting = QRConstants.frontLightMode == FrontLightMode.ON;
    doSetTorch(parameters, currentSetting, safeMode, disableExposure);
  }

  private void doSetTorch(Camera.Parameters parameters, boolean newSetting, boolean safeMode, boolean disableExposure) {
    CameraConfigurationUtils.setTorch(parameters, newSetting);
    if (!safeMode && !disableExposure) {
      CameraConfigurationUtils.setBestExposure(parameters, newSetting);
    }
  }
}
```
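All of the orientation handling above boils down to clockwise modular arithmetic: the rotation from display to camera is `(360 + cameraOrientation - displayRotation) % 360`, and front cameras are mirrored, so their compensation runs the other way as `(360 - r) % 360`. A standalone sketch of those two formulas (helper names are mine):

```java
public class RotationMath {

    // Clockwise rotation needed to map display coordinates onto camera coordinates.
    public static int displayToCamera(int cameraOrientation, int displayRotation) {
        return (360 + cameraOrientation - displayRotation) % 360;
    }

    // Front cameras are mirrored, so the needed rotation is reversed.
    public static int neededRotation(int displayToCamera, boolean frontFacing) {
        return frontFacing ? (360 - displayToCamera) % 360 : displayToCamera;
    }

    public static void main(String[] args) {
        // Typical phone: back camera mounted at 90 degrees, device held portrait (display at 0).
        System.out.println(displayToCamera(90, 0));    // 90
        System.out.println(neededRotation(90, false)); // 90
        System.out.println(neededRotation(90, true));  // 270
    }
}
```

The `360 +` guard keeps the intermediate value non-negative, since Java's `%` can return negative results for negative operands.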
CameraConfigurationUtils.java
```java
import android.annotation.TargetApi;
import android.graphics.Point;
import android.graphics.Rect;
import android.hardware.Camera;
import android.os.Build;
import android.util.Log;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.Comparator;
import java.util.Iterator;
import java.util.List;
import java.util.regex.Pattern;

@TargetApi(Build.VERSION_CODES.ICE_CREAM_SANDWICH_MR1)
public final class CameraConfigurationUtils {

  private static final String TAG = "CameraConfiguration";

  private static final Pattern SEMICOLON = Pattern.compile(";");

  private static final int MIN_PREVIEW_PIXELS = 480 * 320; // normal screen
  private static final float MAX_EXPOSURE_COMPENSATION = 1.5f;
  private static final float MIN_EXPOSURE_COMPENSATION = 0.0f;
  private static final double MAX_ASPECT_DISTORTION = 0.15;
  private static final int MIN_FPS = 10;
  private static final int MAX_FPS = 20;
  private static final int AREA_PER_1000 = 400;

  private CameraConfigurationUtils() {
  }

  public static void setFocus(Camera.Parameters parameters,
                              boolean disableContinuous,
                              boolean safeMode) {
    List<String> supportedFocusModes = parameters.getSupportedFocusModes();
    String focusMode = null;
    if (safeMode || disableContinuous) {
      focusMode = findSettableValue("focus mode",
          supportedFocusModes,
          Camera.Parameters.FOCUS_MODE_AUTO);
    } else {
      focusMode = findSettableValue("focus mode",
          supportedFocusModes,
          Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE,
          Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO,
          Camera.Parameters.FOCUS_MODE_AUTO);
    }
    // Maybe selected auto-focus but not available, so fall through here:
    if (!safeMode && focusMode == null) {
      focusMode = findSettableValue("focus mode",
          supportedFocusModes,
          Camera.Parameters.FOCUS_MODE_MACRO,
          Camera.Parameters.FOCUS_MODE_EDOF);
    }
    if (focusMode != null) {
      if (focusMode.equals(parameters.getFocusMode())) {
        Log.i(TAG, "Focus mode already set to " + focusMode);
      } else {
        parameters.setFocusMode(focusMode);
      }
    }
  }

  public static void setTorch(Camera.Parameters parameters, boolean on) {
    List<String> supportedFlashModes = parameters.getSupportedFlashModes();
    String flashMode;
    if (on) {
      flashMode = findSettableValue("flash mode",
          supportedFlashModes,
          Camera.Parameters.FLASH_MODE_TORCH,
          Camera.Parameters.FLASH_MODE_ON);
    } else {
      flashMode = findSettableValue("flash mode",
          supportedFlashModes,
          Camera.Parameters.FLASH_MODE_OFF);
    }
    if (flashMode != null) {
      if (flashMode.equals(parameters.getFlashMode())) {
        Log.i(TAG, "Flash mode already set to " + flashMode);
      } else {
        Log.i(TAG, "Setting flash mode to " + flashMode);
        parameters.setFlashMode(flashMode);
      }
    }
  }

  public static void setBestExposure(Camera.Parameters parameters, boolean lightOn) {
    int minExposure = parameters.getMinExposureCompensation();
    int maxExposure = parameters.getMaxExposureCompensation();
    float step = parameters.getExposureCompensationStep();
    if ((minExposure != 0 || maxExposure != 0) && step > 0.0f) {
      // Set low when light is on
      float targetCompensation = lightOn ? MIN_EXPOSURE_COMPENSATION : MAX_EXPOSURE_COMPENSATION;
      int compensationSteps = Math.round(targetCompensation / step);
      float actualCompensation = step * compensationSteps;
      // Clamp value:
      compensationSteps = Math.max(Math.min(compensationSteps, maxExposure), minExposure);
      if (parameters.getExposureCompensation() == compensationSteps) {
        Log.i(TAG, "Exposure compensation already set to " + compensationSteps + " / " + actualCompensation);
      } else {
        Log.i(TAG, "Setting exposure compensation to " + compensationSteps + " / " + actualCompensation);
        parameters.setExposureCompensation(compensationSteps);
      }
    } else {
      Log.i(TAG, "Camera does not support exposure compensation");
    }
  }

  public static void setBestPreviewFPS(Camera.Parameters parameters) {
    setBestPreviewFPS(parameters, MIN_FPS, MAX_FPS);
  }

  public static void setBestPreviewFPS(Camera.Parameters parameters, int minFPS, int maxFPS) {
    List<int[]> supportedPreviewFpsRanges = parameters.getSupportedPreviewFpsRange();
    Log.i(TAG, "Supported FPS ranges: " + toString(supportedPreviewFpsRanges));
    if (supportedPreviewFpsRanges != null && !supportedPreviewFpsRanges.isEmpty()) {
      int[] suitableFPSRange = null;
      for (int[] fpsRange : supportedPreviewFpsRanges) {
        int thisMin = fpsRange[Camera.Parameters.PREVIEW_FPS_MIN_INDEX];
        int thisMax = fpsRange[Camera.Parameters.PREVIEW_FPS_MAX_INDEX];
        if (thisMin >= minFPS * 1000 && thisMax <= maxFPS * 1000) {
          suitableFPSRange = fpsRange;
          break;
        }
      }
      if (suitableFPSRange == null) {
        Log.i(TAG, "No suitable FPS range?");
      } else {
        int[] currentFpsRange = new int[2];
        parameters.getPreviewFpsRange(currentFpsRange);
        if (Arrays.equals(currentFpsRange, suitableFPSRange)) {
          Log.i(TAG, "FPS range already set to " + Arrays.toString(suitableFPSRange));
        } else {
          Log.i(TAG, "Setting FPS range to " + Arrays.toString(suitableFPSRange));
          parameters.setPreviewFpsRange(suitableFPSRange[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
                                        suitableFPSRange[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
        }
      }
    }
  }

  public static void setFocusArea(Camera.Parameters parameters) {
    if (parameters.getMaxNumFocusAreas() > 0) {
      Log.i(TAG, "Old focus areas: " + toString(parameters.getFocusAreas()));
      List<Camera.Area> middleArea = buildMiddleArea(AREA_PER_1000);
      Log.i(TAG, "Setting focus area to : " + toString(middleArea));
      parameters.setFocusAreas(middleArea);
    } else {
      Log.i(TAG, "Device does not support focus areas");
    }
  }

  public static void setMetering(Camera.Parameters parameters) {
    if (parameters.getMaxNumMeteringAreas() > 0) {
      Log.i(TAG, "Old metering areas: " + parameters.getMeteringAreas());
      List<Camera.Area> middleArea = buildMiddleArea(AREA_PER_1000);
      Log.i(TAG, "Setting metering area to : " + toString(middleArea));
      parameters.setMeteringAreas(middleArea);
    } else {
      Log.i(TAG, "Device does not support metering areas");
    }
  }

  private static List<Camera.Area> buildMiddleArea(int areaPer1000) {
    return Collections.singletonList(
        new Camera.Area(new Rect(-areaPer1000, -areaPer1000, areaPer1000, areaPer1000), 1));
  }

  public static void setVideoStabilization(Camera.Parameters parameters) {
    if (parameters.isVideoStabilizationSupported()) {
      if (parameters.getVideoStabilization()) {
        Log.i(TAG, "Video stabilization already enabled");
      } else {
        Log.i(TAG, "Enabling video stabilization...");
        parameters.setVideoStabilization(true);
      }
    } else {
      Log.i(TAG, "This device does not support video stabilization");
    }
  }

  public static void setBarcodeSceneMode(Camera.Parameters parameters) {
    if (Camera.Parameters.SCENE_MODE_BARCODE.equals(parameters.getSceneMode())) {
      Log.i(TAG, "Barcode scene mode already set");
      return;
    }
    String sceneMode = findSettableValue("scene mode",
        parameters.getSupportedSceneModes(),
        Camera.Parameters.SCENE_MODE_BARCODE);
    if (sceneMode != null) {
      parameters.setSceneMode(sceneMode);
    }
  }

  public static void setZoom(Camera.Parameters parameters, double targetZoomRatio) {
    if (parameters.isZoomSupported()) {
      Integer zoom = indexOfClosestZoom(parameters, targetZoomRatio);
      if (zoom == null) {
        return;
      }
      if (parameters.getZoom() == zoom) {
        Log.i(TAG, "Zoom is already set to " + zoom);
      } else {
        Log.i(TAG, "Setting zoom to " + zoom);
        parameters.setZoom(zoom);
      }
    } else {
      Log.i(TAG, "Zoom is not supported");
    }
  }

  private static Integer indexOfClosestZoom(Camera.Parameters parameters, double targetZoomRatio) {
    List<Integer> ratios = parameters.getZoomRatios();
    Log.i(TAG, "Zoom ratios: " + ratios);
    int maxZoom = parameters.getMaxZoom();
    if (ratios == null || ratios.isEmpty() || ratios.size() != maxZoom + 1) {
      Log.w(TAG, "Invalid zoom ratios!");
      return null;
    }
    double target100 = 100.0 * targetZoomRatio;
    double smallestDiff = Double.POSITIVE_INFINITY;
    int closestIndex = 0;
    for (int i = 0; i < ratios.size(); i++) {
      double diff = Math.abs(ratios.get(i) - target100);
      if (diff < smallestDiff) {
        smallestDiff = diff;
        closestIndex = i;
      }
    }
    Log.i(TAG, "Chose zoom ratio of " + (ratios.get(closestIndex) / 100.0));
    return closestIndex;
  }

  public static void setInvertColor(Camera.Parameters parameters) {
    if (Camera.Parameters.EFFECT_NEGATIVE.equals(parameters.getColorEffect())) {
      Log.i(TAG, "Negative effect already set");
      return;
    }
    String colorMode = findSettableValue("color effect",
        parameters.getSupportedColorEffects(),
        Camera.Parameters.EFFECT_NEGATIVE);
    if (colorMode != null) {
      parameters.setColorEffect(colorMode);
    }
  }

  public static Point findBestPreviewSizeValue(Camera.Parameters parameters, Point screenResolution) {
    List<Camera.Size> rawSupportedSizes = parameters.getSupportedPreviewSizes();
    if (rawSupportedSizes == null) {
      Log.w(TAG, "Device returned no supported preview sizes; using default");
      Camera.Size defaultSize = parameters.getPreviewSize();
      if (defaultSize == null) {
        throw new IllegalStateException("Parameters contained no preview size!");
      }
      return new Point(defaultSize.width, defaultSize.height);
    }

    // Sort by size, descending
    List<Camera.Size> supportedPreviewSizes = new ArrayList<>(rawSupportedSizes);
    Collections.sort(supportedPreviewSizes, new Comparator<Camera.Size>() {
      @Override
      public int compare(Camera.Size a, Camera.Size b) {
        int aPixels = a.height * a.width;
        int bPixels = b.height * b.width;
        if (bPixels < aPixels) {
          return -1;
        }
        if (bPixels > aPixels) {
          return 1;
        }
        return 0;
      }
    });

    if (Log.isLoggable(TAG, Log.INFO)) {
      StringBuilder previewSizesString = new StringBuilder();
      for (Camera.Size supportedPreviewSize : supportedPreviewSizes) {
        previewSizesString.append(supportedPreviewSize.width).append('x')
            .append(supportedPreviewSize.height).append(' ');
      }
      Log.i(TAG, "Supported preview sizes: " + previewSizesString);
    }

    double screenAspectRatio = (double) screenResolution.x / (double) screenResolution.y;

    // Remove sizes that are unsuitable
    Iterator<Camera.Size> it = supportedPreviewSizes.iterator();
    while (it.hasNext()) {
      Camera.Size supportedPreviewSize = it.next();
      int realWidth = supportedPreviewSize.width;
      int realHeight = supportedPreviewSize.height;
      if (realWidth * realHeight < MIN_PREVIEW_PIXELS) {
        it.remove();
        continue;
      }

      boolean isCandidatePortrait = realWidth < realHeight;
      int maybeFlippedWidth = isCandidatePortrait ? realHeight : realWidth;
      int maybeFlippedHeight = isCandidatePortrait ? realWidth : realHeight;
      double aspectRatio = (double) maybeFlippedWidth / (double) maybeFlippedHeight;
      double distortion = Math.abs(aspectRatio - screenAspectRatio);
      if (distortion > MAX_ASPECT_DISTORTION) {
        it.remove();
        continue;
      }

      if (maybeFlippedWidth == screenResolution.x && maybeFlippedHeight == screenResolution.y) {
        Point exactPoint = new Point(realWidth, realHeight);
        Log.i(TAG, "Found preview size exactly matching screen size: " + exactPoint);
        return exactPoint;
      }
    }

    // If no exact match, use largest preview size. This was not a great idea on older devices because
    // of the additional computation needed. We're likely to get here on newer Android 4+ devices, where
    // the CPU is much more powerful.
    if (!supportedPreviewSizes.isEmpty()) {
      Camera.Size largestPreview = supportedPreviewSizes.get(0);
      Point largestSize = new Point(largestPreview.width, largestPreview.height);
      Log.i(TAG, "Using largest suitable preview size: " + largestSize);
      return largestSize;
    }

    // If there is nothing at all suitable, return current preview size
    Camera.Size defaultPreview = parameters.getPreviewSize();
    if (defaultPreview == null) {
      throw new IllegalStateException("Parameters contained no preview size!");
    }
    Point defaultSize = new Point(defaultPreview.width, defaultPreview.height);
    Log.i(TAG, "No suitable preview sizes, using default: " + defaultSize);
    return defaultSize;
  }

  private static String findSettableValue(String name,
                                          Collection<String> supportedValues,
                                          String... desiredValues) {
    Log.i(TAG, "Requesting " + name + " value from among: " + Arrays.toString(desiredValues));
    Log.i(TAG, "Supported " + name + " values: " + supportedValues);
    if (supportedValues != null) {
      for (String desiredValue : desiredValues) {
        if (supportedValues.contains(desiredValue)) {
          Log.i(TAG, "Can set " + name + " to: " + desiredValue);
          return desiredValue;
        }
      }
    }
    Log.i(TAG, "No supported values match");
    return null;
  }

  private static String toString(Collection<int[]> arrays) {
    if (arrays == null || arrays.isEmpty()) {
      return "[]";
    }
    StringBuilder buffer = new StringBuilder();
    buffer.append('[');
    Iterator<int[]> it = arrays.iterator();
    while (it.hasNext()) {
      buffer.append(Arrays.toString(it.next()));
      if (it.hasNext()) {
        buffer.append(", ");
      }
    }
    buffer.append(']');
    return buffer.toString();
  }

  private static String toString(Iterable<Camera.Area> areas) {
    if (areas == null) {
      return null;
    }
    StringBuilder result = new StringBuilder();
    for (Camera.Area area : areas) {
      result.append(area.rect).append(':').append(area.weight).append(' ');
    }
    return result.toString();
  }

  public static String collectStats(Camera.Parameters parameters) {
    return collectStats(parameters.flatten());
  }

  public static String collectStats(CharSequence flattenedParams) {
    StringBuilder result = new StringBuilder(1000);
    result.append("BOARD=").append(Build.BOARD).append('\n');
    result.append("BRAND=").append(Build.BRAND).append('\n');
    result.append("CPU_ABI=").append(Build.CPU_ABI).append('\n');
    result.append("DEVICE=").append(Build.DEVICE).append('\n');
    result.append("DISPLAY=").append(Build.DISPLAY).append('\n');
    result.append("FINGERPRINT=").append(Build.FINGERPRINT).append('\n');
    result.append("HOST=").append(Build.HOST).append('\n');
    result.append("ID=").append(Build.ID).append('\n');
    result.append("MANUFACTURER=").append(Build.MANUFACTURER).append('\n');
    result.append("MODEL=").append(Build.MODEL).append('\n');
    result.append("PRODUCT=").append(Build.PRODUCT).append('\n');
    result.append("TAGS=").append(Build.TAGS).append('\n');
    result.append("TIME=").append(Build.TIME).append('\n');
    result.append("TYPE=").append(Build.TYPE).append('\n');
    result.append("USER=").append(Build.USER).append('\n');
    result.append("VERSION.CODENAME=").append(Build.VERSION.CODENAME).append('\n');
    result.append("VERSION.INCREMENTAL=").append(Build.VERSION.INCREMENTAL).append('\n');
    result.append("VERSION.RELEASE=").append(Build.VERSION.RELEASE).append('\n');
    result.append("VERSION.SDK_INT=").append(Build.VERSION.SDK_INT).append('\n');
    if (flattenedParams != null) {
      String[] params = SEMICOLON.split(flattenedParams);
      for (String param : params) {
        result.append(param).append('\n');
      }
    }
    return result.toString();
  }
}
```
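The core of findBestPreviewSizeValue is a filter: discard sizes under MIN_PREVIEW_PIXELS, discard sizes whose aspect ratio strays more than MAX_ASPECT_DISTORTION from the screen's, return an exact match immediately, and otherwise take the largest survivor. The same logic stripped of the Camera API, with sizes as {width, height} pairs (method names here are mine):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class PreviewSizeFilter {

    static final int MIN_PREVIEW_PIXELS = 480 * 320;
    static final double MAX_ASPECT_DISTORTION = 0.15;

    // Returns the chosen {width, height}, or null if nothing qualifies.
    public static int[] choose(List<int[]> sizes, int screenW, int screenH) {
        double screenAspect = (double) screenW / screenH;
        List<int[]> candidates = new ArrayList<>();
        for (int[] s : sizes) {
            int w = s[0], h = s[1];
            if (w * h < MIN_PREVIEW_PIXELS) continue;          // too small to decode from
            boolean portrait = w < h;                          // preview sizes are landscape;
            int fw = portrait ? h : w, fh = portrait ? w : h;  // flip if reported portrait
            double distortion = Math.abs((double) fw / fh - screenAspect);
            if (distortion > MAX_ASPECT_DISTORTION) continue;  // wrong shape for this screen
            if (fw == screenW && fh == screenH) return s;      // exact match wins outright
            candidates.add(s);
        }
        // Otherwise take the largest surviving size.
        return candidates.stream()
            .max(Comparator.comparingInt((int[] s) -> s[0] * s[1]))
            .orElse(null);
    }

    public static void main(String[] args) {
        List<int[]> sizes = java.util.Arrays.asList(
            new int[] {176, 144}, new int[] {1280, 720}, new int[] {1920, 1080});
        int[] best = choose(sizes, 1920, 1080);
        System.out.println(best[0] + "x" + best[1]); // 1920x1080
    }
}
```

Choosing a large, correctly shaped preview matters for this use case: damaged or faintly printed codes need all the luminance resolution the camera can deliver.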
CameraManager.java
```java
import android.content.Context;
import android.graphics.Point;
import android.graphics.Rect;
import android.hardware.Camera;
import android.os.Handler;
import android.util.Log;
import android.view.SurfaceHolder;

import com.google.zxing.PlanarYUVLuminanceSource;

import java.io.IOException;

public final class CameraManager {

  private static final String TAG = CameraManager.class.getSimpleName();

  private static final int MIN_FRAME_WIDTH = 240;
  private static final int MIN_FRAME_HEIGHT = 240;
  private static final int MAX_FRAME_WIDTH = 1200; // = 5/8 * 1920
  private static final int MAX_FRAME_HEIGHT = 675; // = 5/8 * 1080

  private final Context context;
  private final CameraConfigurationManager configManager;
  private OpenCamera camera;
  private AutoFocusManager autoFocusManager;
  private Rect framingRect;
  private Rect framingRectInPreview;
  private boolean initialized;
  private boolean previewing;
  private int requestedCameraId = OpenCameraInterface.NO_REQUESTED_CAMERA;
  private int requestedFramingRectWidth;
  private int requestedFramingRectHeight;

  /**
   * Preview frames are delivered here, which we pass on to the registered handler. Make sure to
   * clear the handler so it will only receive one message.
   */
  private final PreviewCallback previewCallback;

  public CameraManager(Context context) {
    this.context = context;
    this.configManager = new CameraConfigurationManager(context);
    previewCallback = new PreviewCallback(configManager);
  }

  /**
   * Opens the camera driver and initializes the hardware parameters.
   *
   * @param holder The surface object which the camera will draw preview frames into.
   * @throws IOException Indicates the camera driver failed to open.
   */
  public synchronized void openDriver(SurfaceHolder holder) throws IOException {
    OpenCamera theCamera = camera;
    if (theCamera == null) {
      theCamera = OpenCameraInterface.open(requestedCameraId);
      if (theCamera == null) {
        throw new IOException("Camera.open() failed to return object from driver");
      }
      camera = theCamera;
    }

    if (!initialized) {
      initialized = true;
      configManager.initFromCameraParameters(theCamera);
      if (requestedFramingRectWidth > 0 && requestedFramingRectHeight > 0) {
        setManualFramingRect(requestedFramingRectWidth, requestedFramingRectHeight);
        requestedFramingRectWidth = 0;
        requestedFramingRectHeight = 0;
      }
    }

    Camera cameraObject = theCamera.getCamera();
    Camera.Parameters parameters = cameraObject.getParameters();
    String parametersFlattened = parameters == null ? null : parameters.flatten(); // Save these, temporarily
    try {
      configManager.setDesiredCameraParameters(theCamera, false);
    } catch (RuntimeException re) {
      // Driver failed
      Log.w(TAG, "Camera rejected parameters. Setting only minimal safe-mode parameters");
      Log.i(TAG, "Resetting to saved camera params: " + parametersFlattened);
      // Reset:
      if (parametersFlattened != null) {
        parameters = cameraObject.getParameters();
        parameters.unflatten(parametersFlattened);
        try {
          cameraObject.setParameters(parameters);
          configManager.setDesiredCameraParameters(theCamera, true);
        } catch (RuntimeException re2) {
          // Well, darn. Give up
          Log.w(TAG, "Camera rejected even safe-mode parameters! No configuration");
        }
      }
    }
    cameraObject.setPreviewDisplay(holder);
  }

  public synchronized boolean isOpen() {
    return camera != null;
  }

  /** Closes the camera driver if still in use. */
  public synchronized void closeDriver() {
    if (camera != null) {
      camera.getCamera().release();
      camera = null;
      // Make sure to clear these each time we close the camera, so that any scanning rect
      // requested by intent is forgotten.
      framingRect = null;
      framingRectInPreview = null;
    }
  }

  /** Asks the camera hardware to begin drawing preview frames to the screen. */
  public synchronized void startPreview() {
    OpenCamera theCamera = camera;
    if (theCamera != null && !previewing) {
      theCamera.getCamera().startPreview();
      previewing = true;
      autoFocusManager = new AutoFocusManager(context, theCamera.getCamera());
    }
  }

  /** Tells the camera to stop drawing preview frames. */
  public synchronized void stopPreview() {
    if (autoFocusManager != null) {
      autoFocusManager.stop();
      autoFocusManager = null;
    }
    if (camera != null && previewing) {
      camera.getCamera().stopPreview();
      previewCallback.setHandler(null, 0);
      previewing = false;
    }
  }

  /**
   * Convenience method for {@link CaptureActivity}
   *
   * @param newSetting if {@code true}, light should be turned on if currently off. And vice versa.
   */
  public synchronized void setTorch(boolean newSetting) {
    OpenCamera theCamera = camera;
    if (theCamera != null && newSetting != configManager.getTorchState(theCamera.getCamera())) {
      boolean wasAutoFocusManager = autoFocusManager != null;
      if (wasAutoFocusManager) {
        autoFocusManager.stop();
        autoFocusManager = null;
      }
      configManager.setTorch(theCamera.getCamera(), newSetting);
      if (wasAutoFocusManager) {
        autoFocusManager = new AutoFocusManager(context, theCamera.getCamera());
        autoFocusManager.start();
      }
    }
  }

  /**
   * A single preview frame will be returned to the handler supplied. The data will arrive as byte[]
   * in the message.obj field, with width and height encoded as message.arg1 and message.arg2,
   * respectively.
   *
   * @param handler The handler to send the message to.
   * @param message The what field of the message to be sent.
   */
  public synchronized void requestPreviewFrame(Handler handler, int message) {
    OpenCamera theCamera = camera;
    if (theCamera != null && previewing) {
      previewCallback.setHandler(handler, message);
      theCamera.getCamera().setOneShotPreviewCallback(previewCallback);
    }
  }
```
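The scan-frame sizing used by getFramingRect is plain arithmetic: each dimension targets 5/8 of the screen, clamped to a [min, max] range, and this modified version then squares the frame with the smaller of the two results and centers it. A standalone sketch of that math (the class and method names here are mine):

```java
public class FramingMath {

    // Target 5/8 of the dimension, clamped to [hardMin, hardMax].
    public static int findDesiredDimensionInRange(int resolution, int hardMin, int hardMax) {
        int dim = 5 * resolution / 8;
        if (dim < hardMin) {
            return hardMin;
        }
        return Math.min(dim, hardMax);
    }

    // Centered square frame: {left, top, size}.
    public static int[] squareFrame(int screenW, int screenH,
                                    int minDim, int maxW, int maxH) {
        int width = findDesiredDimensionInRange(screenW, minDim, maxW);
        int height = findDesiredDimensionInRange(screenH, minDim, maxH);
        int size = Math.min(width, height); // keep the scan frame square
        return new int[] {(screenW - size) / 2, (screenH - size) / 2, size};
    }

    public static void main(String[] args) {
        // 1080x1920 portrait screen with the constants from CameraManager.
        int[] frame = squareFrame(1080, 1920, 240, 1200, 675);
        System.out.println(frame[0] + "," + frame[1] + " size=" + frame[2]); // 202,622 size=675
    }
}
```

The later getFramingRectInPreview step then just rescales each edge by cameraResolution/screenResolution, mapping the on-screen square into preview-frame coordinates.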
```java
  /**
   * Calculates the framing rect which the UI should draw to show the user where to place the
   * barcode. This target helps with alignment as well as forces the user to hold the device
   * far enough away to ensure the image will be in focus.
   *
   * @return The rectangle to draw on screen in window coordinates.
   */
  public synchronized Rect getFramingRect() {
    if (framingRect == null) {
      Point screenResolution = configManager.getScreenResolution();
      if (screenResolution == null) {
        // Called early, before init even finished
        return null;
      }
      int width = findDesiredDimensionInRange(screenResolution.x, MIN_FRAME_WIDTH, MAX_FRAME_WIDTH);
      int height = findDesiredDimensionInRange(screenResolution.y, MIN_FRAME_HEIGHT, MAX_FRAME_HEIGHT);
      // Keep the scan frame square (same width and height)
      int finalSize = Math.min(width, height);
      int leftOffset = (screenResolution.x - finalSize) / 2;
      int topOffset = (screenResolution.y - finalSize) / 2;
      framingRect = new Rect(leftOffset, topOffset, leftOffset + finalSize, topOffset + finalSize);
      Log.d(TAG, "Calculated framing rect: " + framingRect
          + " width =" + width + " height = " + height
          + " screenResolution.x = " + screenResolution.x
          + " screenResolution.y = " + screenResolution.y);
    }
    return framingRect;
  }

  private static int findDesiredDimensionInRange(int resolution, int hardMin, int hardMax) {
    int dim = 5 * resolution / 8; // Target 5/8 of each dimension
    if (dim < hardMin) {
      return hardMin;
    }
    return Math.min(dim, hardMax);
  }

  /**
   * Like {@link #getFramingRect} but coordinates are in terms of the preview frame,
   * not UI / screen.
   *
   * @return {@link Rect} expressing barcode scan area in terms of the preview size
   */
  public synchronized Rect getFramingRectInPreview() {
    if (framingRectInPreview == null) {
      Rect framingRect = getFramingRect();
      if (framingRect == null) {
        return null;
      }
      Rect rect = new Rect(framingRect);
      Point cameraResolution = configManager.getCameraResolution();
      Point screenResolution = configManager.getScreenResolution();
      if (cameraResolution == null || screenResolution == null) {
        // Called early, before init even finished
        return null;
      }
      rect.left = rect.left * cameraResolution.x / screenResolution.x;
      rect.right = rect.right * cameraResolution.x / screenResolution.x;
      rect.top = rect.top * cameraResolution.y / screenResolution.y;
      rect.bottom = rect.bottom * cameraResolution.y / screenResolution.y;
      framingRectInPreview = rect;
    }
    return framingRectInPreview;
  }

  /**
   * Allows third party apps to specify the camera ID, rather than determine
   * it automatically based on available cameras and their orientation.
   *
   * @param cameraId camera ID of the camera to use. A negative value means "no preference".
   */
  public synchronized void setManualCameraId(int cameraId) {
    requestedCameraId = cameraId;
  }

  /**
   * Allows third party apps to specify the scanning rectangle dimensions, rather than determine
   * them automatically based on screen resolution.
   *
   * @param width The width in pixels to scan.
   * @param height The height in pixels to scan.
   */
  public synchronized void setManualFramingRect(int width, int height) {
    if (initialized) {
      Point screenResolution = configManager.getScreenResolution();
      if (width > screenResolution.x) {
        width = screenResolution.x;
      }
      if (height > screenResolution.y) {
        height = screenResolution.y;
      }
      int leftOffset = (screenResolution.x - width) / 2;
      int topOffset = (screenResolution.y - height) / 2;
      framingRect = new Rect(leftOffset, topOffset, leftOffset + width, topOffset + height);
      Log.d(TAG, "Calculated manual framing rect: " + framingRect);
      framingRectInPreview = null;
    } else {
      requestedFramingRectWidth = width;
      requestedFramingRectHeight = height;
    }
  }

  /**
   * A factory method to build the appropriate LuminanceSource object based on the format
   * of the preview buffers, as described by Camera.Parameters.
   *
   * @param data A preview frame.
   * @param width The width of the image.
   * @param height The height of the image.
   * @return A PlanarYUVLuminanceSource instance.
   */
  public PlanarYUVLuminanceSource buildLuminanceSource(byte[] data, int width, int height) {
    Rect rect = getFramingRectInPreview();
    if (rect == null) {
      return null;
    }
    // Go ahead and assume it's YUV rather than die.
    return new PlanarYUVLuminanceSource(data, width, height, rect.left, rect.top,
        rect.width(), rect.height(), false);
  }

  public void openLight() {
    if (camera != null && camera.getCamera() != null) {
      Camera.Parameters parameter = camera.getCamera().getParameters();
      parameter.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
      camera.getCamera().setParameters(parameter);
    }
  }

  // The signature of this method was lost in the original dump; reconstructed here as the
  // obvious counterpart to openLight().
  public void offLight() {
    if (camera != null && camera.getCamera() != null) {
      Camera.Parameters parameter = camera.getCamera().getParameters();
      parameter.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
      camera.getCamera().setParameters(parameter);
    }
  }
}
```

FrontLightMode.java
public enum FrontLightMode {

    /** Always on. */
    ON,
    /** On only when ambient light is low. */
    AUTO,
    /** Always off. */
    OFF;

    private static FrontLightMode parse(String modeString) {
        return modeString == null ? OFF : valueOf(modeString);
    }

    public static FrontLightMode readPref(SharedPreferences sharedPrefs) {
        return parse(sharedPrefs.getString(PreferencesActivity.KEY_FRONT_LIGHT_MODE, OFF.toString()));
    }
}
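The `parse` helper falls back to `OFF` whenever no preference is stored. A minimal stand-alone sketch of that fallback pattern, using a local `Mode` enum as a stand-in since the Android `SharedPreferences` plumbing is omitted:

```java
public class FrontLightModeDemo {
    // Local stand-in for the FrontLightMode enum above
    enum Mode {
        ON, AUTO, OFF;

        // Same fallback rule as FrontLightMode.parse: null means "use OFF"
        static Mode parse(String modeString) {
            return modeString == null ? OFF : valueOf(modeString);
        }
    }

    public static void main(String[] args) {
        System.out.println(Mode.parse(null));   // no stored preference → OFF
        System.out.println(Mode.parse("AUTO")); // stored value wins → AUTO
    }
}
```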
PreviewCallback.java
final class PreviewCallback implements Camera.PreviewCallback {

    private static final String TAG = PreviewCallback.class.getSimpleName();

    private final CameraConfigurationManager configManager;
    private Handler previewHandler;
    private int previewMessage;

    PreviewCallback(CameraConfigurationManager configManager) {
        this.configManager = configManager;
    }

    void setHandler(Handler previewHandler, int previewMessage) {
        this.previewHandler = previewHandler;
        this.previewMessage = previewMessage;
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Point cameraResolution = configManager.getCameraResolution();
        Handler thePreviewHandler = previewHandler;
        if (cameraResolution != null && thePreviewHandler != null) {
            Message message = thePreviewHandler.obtainMessage(previewMessage,
                    cameraResolution.x, cameraResolution.y, data);
            message.sendToTarget();
            previewHandler = null;
        } else {
            Log.d(TAG, "Got preview callback, but no handler or resolution available");
        }
    }
}
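`onPreviewFrame` clears `previewHandler` after sending, so each `requestPreviewFrame` call delivers exactly one frame to the decoder. A plain-Java sketch of this one-shot handler pattern, with a `Consumer<byte[]>` as a hypothetical stand-in for the Android `Handler`/`Message` plumbing:

```java
import java.util.function.Consumer;

public class OneShotCallbackDemo {
    private Consumer<byte[]> frameHandler; // set for one delivery, then cleared

    void setHandler(Consumer<byte[]> handler) {
        this.frameHandler = handler;
    }

    // Mirrors onPreviewFrame: deliver to the current handler once, then null it out
    // so later frames are dropped until a new one-shot request arrives.
    void onFrame(byte[] data) {
        Consumer<byte[]> h = frameHandler;
        if (h != null) {
            frameHandler = null;
            h.accept(data);
        }
    }

    public static void main(String[] args) {
        OneShotCallbackDemo demo = new OneShotCallbackDemo();
        int[] delivered = {0};
        demo.setHandler(d -> delivered[0]++);
        demo.onFrame(new byte[0]); // delivered
        demo.onFrame(new byte[0]); // dropped: handler already consumed
        System.out.println(delivered[0]); // → 1
    }
}
```

This is why a failed decode must explicitly call `requestPreviewFrame` again: without re-registering, no further frames arrive.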
QRConstants.java
public class QRConstants {
    public static boolean vibrateEnable = true;
    public static boolean beepEnable = true;
    public static FrontLightMode frontLightMode = FrontLightMode.OFF;
    public static boolean disableExposure = true;
    public static boolean autoFocus = true;
    public static final String KEY_AUTO_FOCUS = "preferences_auto_focus";
}
Constant.java
public class Constant {
    public static final int DECODE = 1;
    public static final int DECODE_FAILED = 2;
    public static final int DECODE_SUCCEEDED = 3;
    public static final int LAUNCH_PRODUCT_QUERY = 4;
    public static final int QUIT = 5;
    public static final int RESTART_PREVIEW = 6;
    public static final int RETURN_SCAN_RESULT = 7;
    public static final int FLASH_OPEN = 8;
    public static final int FLASH_CLOSE = 9;
    public static final int REQUEST_IMAGE = 10;

    public static final String CODED_CONTENT = "codedContent";
    public static final String CODED_BITMAP = "codedBitmap";
    public static final String INTENT_ZXING_CONFIG = "zxingConfig";
}
AmbientLightManager.java
public final class AmbientLightManager implements SensorEventListener {

    private static final float TOO_DARK_LUX = 45.0f;
    private static final float BRIGHT_ENOUGH_LUX = 450.0f;

    private final Context context;
    private CameraManager cameraManager;
    private Sensor lightSensor;

    public AmbientLightManager(Context context) {
        this.context = context;
    }

    public void start(CameraManager cameraManager) {
        this.cameraManager = cameraManager;
        SharedPreferences sharedPrefs = PreferenceManager.getDefaultSharedPreferences(context);
        if (QRConstants.frontLightMode == FrontLightMode.AUTO) {
            SensorManager sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
            if (lightSensor != null) {
                sensorManager.registerListener(this, lightSensor, SensorManager.SENSOR_DELAY_NORMAL);
            }
        }
    }

    public void stop() {
        if (lightSensor != null) {
            SensorManager sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            sensorManager.unregisterListener(this);
            cameraManager = null;
            lightSensor = null;
        }
    }

    @Override
    public void onSensorChanged(SensorEvent sensorEvent) {
        float ambientLightLux = sensorEvent.values[0];
        if (cameraManager != null) {
            if (ambientLightLux <= TOO_DARK_LUX) {
                cameraManager.setTorch(true);
            } else if (ambientLightLux >= BRIGHT_ENOUGH_LUX) {
                cameraManager.setTorch(false);
            }
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // do nothing
    }
}
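The two thresholds form a hysteresis band: the torch turns on below 45 lux, off above 450 lux, and keeps its previous state in between, which avoids flicker around a single cut-off. A plain-Java sketch of that decision (`decideTorch` is a hypothetical helper, not a method of the class above):

```java
public class TorchHysteresisDemo {
    static final float TOO_DARK_LUX = 45.0f;
    static final float BRIGHT_ENOUGH_LUX = 450.0f;

    // Same rule as onSensorChanged: Boolean.TRUE/FALSE to change the torch,
    // null to leave it alone (the reading fell inside the dead band).
    static Boolean decideTorch(float lux) {
        if (lux <= TOO_DARK_LUX) {
            return Boolean.TRUE;
        } else if (lux >= BRIGHT_ENOUGH_LUX) {
            return Boolean.FALSE;
        }
        return null; // between thresholds: keep current state
    }

    public static void main(String[] args) {
        System.out.println(decideTorch(10f));  // → true  (dark: torch on)
        System.out.println(decideTorch(200f)); // → null  (dead band: no change)
        System.out.println(decideTorch(800f)); // → false (bright: torch off)
    }
}
```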
BeepManager.java
public final class BeepManager implements MediaPlayer.OnCompletionListener,
        MediaPlayer.OnErrorListener, Closeable {

    private static final String TAG = BeepManager.class.getSimpleName();

    private static final float BEEP_VOLUME = 0.10f;
    private static final long VIBRATE_DURATION = 200L;

    private final Activity activity;
    private MediaPlayer mediaPlayer;
    private boolean playBeep;
    private boolean vibrate;

    public BeepManager(Activity activity) {
        this.activity = activity;
    }

    public boolean isPlayBeep() {
        return playBeep;
    }

    public void setPlayBeep(boolean playBeep) {
        this.playBeep = playBeep;
    }

    public boolean isVibrate() {
        return vibrate;
    }

    public void setVibrate(boolean vibrate) {
        this.vibrate = vibrate;
    }

    public synchronized void updatePrefs() {
        if (playBeep && mediaPlayer == null) {
            // The volume on STREAM_SYSTEM is not adjustable, and users found it too loud,
            // so we now play on the music stream.
            // Let the activity's volume keys control the music audio stream.
            activity.setVolumeControlStream(AudioManager.STREAM_MUSIC);
            mediaPlayer = buildMediaPlayer(activity);
        }
    }

    @SuppressLint("MissingPermission")
    public synchronized void playBeepSoundAndVibrate() {
        if (playBeep && mediaPlayer != null) {
            mediaPlayer.start();
        }
        if (vibrate) {
            Vibrator vibrator = (Vibrator) activity
                    .getSystemService(Context.VIBRATOR_SERVICE);
            vibrator.vibrate(VIBRATE_DURATION);
        }
    }

    private MediaPlayer buildMediaPlayer(Context activity) {
        MediaPlayer mediaPlayer = new MediaPlayer();
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mediaPlayer.setOnCompletionListener(this);
        mediaPlayer.setOnErrorListener(this);
        try {
            AssetFileDescriptor file = activity.getResources()
                    .openRawResourceFd(R.raw.beep);
            try {
                mediaPlayer.setDataSource(file.getFileDescriptor(), file.getStartOffset(), file.getLength());
            } finally {
                file.close();
            }
            mediaPlayer.setVolume(BEEP_VOLUME, BEEP_VOLUME);
            mediaPlayer.prepare();
            return mediaPlayer;
        } catch (IOException ioe) {
            Log.w(TAG, ioe);
            mediaPlayer.release();
            return null;
        }
    }

    @Override
    public void onCompletion(MediaPlayer mp) {
        // When the beep has finished playing, rewind to queue up another one.
        mp.seekTo(0);
    }

    @Override
    public synchronized boolean onError(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_ERROR_SERVER_DIED) {
            // we are finished, so put up an appropriate error toast if required and finish
            activity.finish();
        } else {
            // possibly media player error, so release and recreate
            close();
            updatePrefs();
        }
        return true;
    }

    @Override
    public synchronized void close() {
        if (mediaPlayer != null) {
            mediaPlayer.release();
            mediaPlayer = null;
        }
    }
}
CaptureActivity.java
public class CaptureActivity extends Activity implements SurfaceHolder.Callback, View.OnClickListener { AppCompatDelegate.setCompatVectorFromResourcesEnabled(true);//處理api 5.1以下手機(jī)不兼容問題 private static final String TAG = CaptureActivity.class.getSimpleName(); private ImageView mBackmImg; public int REQ_ID_GALLERY = 0; public static boolean isLightOn = false; private IMResUtil mImResUtil; public static void startAction(Activity activity, Bundle bundle, int requestCode) { Intent intent = new Intent(activity, CaptureActivity.class); intent.putExtras(bundle); activity.startActivityForResult(intent, requestCode); private CameraManager cameraManager; private CaptureActivityHandler handler; private ViewfinderView viewfinderView; private boolean hasSurface; private Collection<BarcodeFormat> decodeFormats; private String characterSet; private InactivityTimer inactivityTimer; private BeepManager beepManager; private AmbientLightManager ambientLightManager; private LinearLayout bottomLayout; private TextView flashLightTv; private ImageView flashLightIv; private LinearLayout flashLightLayout; private LinearLayout albumLayout; private ZxingConfig config; private ImageView img_phone; public ViewfinderView getViewfinderView() { public Handler getHandler() { public CameraManager getCameraManager() { protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); mImResUtil = new IMResUtil(this); setContentView(mImResUtil.getLayout("activity_device_qrcode_capture")); setColor(this, Color.BLACK); inactivityTimer = new InactivityTimer(this); ambientLightManager = new AmbientLightManager(this); config = (ZxingConfig) getIntent().getExtras().get(Constant.INTENT_ZXING_CONFIG); Log.i("config", e.toString()); config = new ZxingConfig(); beepManager = new BeepManager(this); beepManager.setPlayBeep(config.isPlayBeep()); beepManager.setVibrate(config.isShake()); initView(getIntent().getExtras()); onEvent(getIntent().getExtras()); private void initView(Bundle 
bundle) { if ("landscape".equals(bundle.getString("portraitOrLandscape"))) { setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE); mBackmImg = (ImageView) findViewById(mImResUtil.getId("iv_qr_back")); mTitle = (TextView) findViewById(mImResUtil.getId("tv_qr_title")); viewfinderView = (ViewfinderView) findViewById(mImResUtil.getId("vv_qr_viewfinderView")); flashLightTv = (TextView) findViewById(mImResUtil.getId("flashLightTv")); bottomLayout = (LinearLayout) findViewById(mImResUtil.getId("bottomLayout")); flashLightIv = (ImageView) findViewById(mImResUtil.getId("flashLightIv")); img_phone = (ImageView) findViewById(mImResUtil.getId("img_phone")); flashLightLayout = (LinearLayout) findViewById(mImResUtil.getId("flashLightLayout")); flashLightLayout.setOnClickListener(this); albumLayout = (LinearLayout) findViewById(mImResUtil.getId("albumLayout")); albumLayout.setOnClickListener(this); private void onEvent(Bundle bundle) { switchVisibility(bottomLayout, config.isShowbottomLayout()); switchVisibility(flashLightLayout, config.isShowFlashLight()); switchVisibility(albumLayout, config.isShowAlbum()); flashLightIv.setImageResource(R.drawable.device_qrcode_scan_flash_off); img_phone.setImageResource(R.drawable.ic_photo); if (isSupportCameraLedFlash(getPackageManager())) { flashLightLayout.setVisibility(View.VISIBLE); flashLightLayout.setVisibility(View.GONE); /********************新增 END*****************************/ mBackmImg.setOnClickListener(this); String titileText = bundle.getString("titileText"); if (titileText != null && !titileText.isEmpty()) { mTitle.setText(titileText); String headColor = bundle.getString("headColor"); mTitle.setTextColor(Color.parseColor(headColor)); float headSize = bundle.getFloat("headSize"); mTitle.setTextSize(headSize); public static void setColor(Activity activity, int color) { if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) { activity.getWindow().addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_STATUS); View 
statusView = createStatusView(activity, color); ViewGroup decorView = (ViewGroup) activity.getWindow().getDecorView(); decorView.addView(statusView); ViewGroup rootView = (ViewGroup) ((ViewGroup) activity.findViewById(android.R.id.content)).getChildAt(0); rootView.setFitsSystemWindows(true); rootView.setClipToPadding(true); private static View createStatusView(Activity activity, int color) { int resourceId = activity.getResources().getIdentifier("status_bar_height", "dimen", "android"); int statusBarHeight = activity.getResources().getDimensionPixelSize(resourceId); View statusView = new View(activity); LinearLayout.LayoutParams params = new LinearLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, statusBarHeight); statusView.setLayoutParams(params); statusView.setBackgroundColor(color); public void onClick(View v) { if (id == mImResUtil.getId("iv_qr_back")) { } else if (id == mImResUtil.getId("albumLayout")) {//打開相冊(cè) // Intent intent = new Intent(); // intent.setType("image/*"); // intent.setAction(Intent.ACTION_GET_CONTENT); // intent.addCategory(Intent.CATEGORY_OPENABLE); // startActivityForResult(intent, REQ_ID_GALLERY); Intent intent = new Intent(); intent.setAction(Intent.ACTION_PICK); intent.setType("image/*"); startActivityForResult(intent, REQ_ID_GALLERY); } else if (id == mImResUtil.getId("flashLightLayout")) {//打開手電筒感應(yīng)部分 cameraManager.offLight(); cameraManager.openLight(); protected void onActivityResult(int requestCode, int resultCode, Intent data) { super.onActivityResult(requestCode, resultCode, data); if (requestCode == REQ_ID_GALLERY) { if (resultCode == RESULT_OK) { Uri uri = data.getData(); String path = FileUtil.checkPicturePath(CaptureActivity.this, uri);//Device.getActivity() BitmapFactory.Options bmOptions = new BitmapFactory.Options(); bmOptions.inJustDecodeBounds = false; bmOptions.inPurgeable = true; Bitmap bmp = BitmapFactory.decodeFile(path, bmOptions); //----------------------------------------------------- if (requestCode 
== Constant.REQUEST_IMAGE && resultCode == RESULT_OK) { String path = ImageUtil.getImageAbsolutePath(this, data.getData()); Log.e(TAG, "onActivityResult: -------二維碼:path" + path); BitmapFactory.Options bmOptions = new BitmapFactory.Options(); bmOptions.inJustDecodeBounds = false; bmOptions.inPurgeable = true; Bitmap bmp = BitmapFactory.decodeFile(path, bmOptions); * @param bitmap 要解析的二維碼圖片 public final Map<DecodeHintType, Object> HINTS = new EnumMap<>(DecodeHintType.class); @SuppressLint("StaticFieldLeak") public void decodeQRCode(final Bitmap bitmap, final Activity activity) { new AsyncTask<Void, Void, String>() { protected String doInBackground(Void... params) { int width = bitmap.getWidth(); int height = bitmap.getHeight(); int[] pixels = new int[width * height]; bitmap.getPixels(pixels, 0, width, 0, 0, width, height); RGBLuminanceSource source = new RGBLuminanceSource(width, height, pixels); Result result = new MultiFormatReader().decode(new BinaryBitmap(new HybridBinarizer(source)), HINTS); if (result != null && result.getText() != null) { Intent resultIntent = new Intent(); Bundle bundle = new Bundle(); bundle.putString(Constant.CODED_CONTENT, url); resultIntent.putExtras(bundle); activity.setResult(RESULT_OK, resultIntent); CaptureActivity.this.finish(); protected void onPostExecute(String result) { Log.d("CaptureActivity", "result=" + result); Toast.makeText(CaptureActivity.this, "解析失敗,換個(gè)圖片試一下", Toast.LENGTH_LONG).show(); protected void onResume() { cameraManager = new CameraManager(getApplication()); viewfinderView.setCameraManager(cameraManager); beepManager.updatePrefs(); ambientLightManager.start(cameraManager); inactivityTimer.onResume(); SurfaceView surfaceView = (SurfaceView) findViewById(mImResUtil.getId("device_qrcode_preview_view")); SurfaceHolder surfaceHolder = surfaceView.getHolder(); initCamera(surfaceHolder); surfaceHolder.addCallback(this); protected void onPause() { handler.quitSynchronously(); inactivityTimer.onPause(); 
ambientLightManager.stop(); cameraManager.closeDriver(); SurfaceView surfaceView = (SurfaceView) findViewById(mImResUtil.getId("device_qrcode_preview_view")); SurfaceHolder surfaceHolder = surfaceView.getHolder(); surfaceHolder.removeCallback(this); protected void onDestroy() { inactivityTimer.shutdown(); private void initCamera(SurfaceHolder surfaceHolder) { if (surfaceHolder == null) { throw new IllegalStateException("No SurfaceHolder provided"); if (cameraManager.isOpen()) { Log.w(TAG, "initCamera() while already open -- late SurfaceView callback?"); cameraManager.openDriver(surfaceHolder); // Creating the handler starts the preview, which can also throw a RuntimeException. handler = new CaptureActivityHandler(this, decodeFormats, characterSet, cameraManager); } catch (IOException ioe) { } catch (RuntimeException e) { // Barcode Scanner has seen crashes in the wild of this variety: // java.?lang.?RuntimeException: Fail to connect to camera service Log.e(TAG, "Unexpected error initializing camera", e); public void drawViewfinder() { viewfinderView.drawViewfinder(); public void handleDecode(Result rawResult, Bitmap barcode, float scaleFactor) { Log.d("wxl", "rawResult=" + rawResult); boolean fromLiveScan = barcode != null; String resultString = rawResult.getText(); Log.e("wxl", "rawResult=" + rawResult.getText()); beepManager.playBeepSoundAndVibrate(); Intent resultIntent = new Intent(); Bundle bundle = new Bundle(); bundle.putString(Constant.CODED_CONTENT, resultString); resultIntent.putExtras(bundle); this.setResult(RESULT_OK, resultIntent); // this.setResult(RESULT_OK, resultIntent); Toast.makeText(CaptureActivity.this, "掃描失敗", Toast.LENGTH_SHORT).show(); CaptureActivity.this.finish(); public void surfaceCreated(SurfaceHolder holder) { Log.e(TAG, "*** WARNING *** surfaceCreated() gave us a null surface!"); public void surfaceDestroyed(SurfaceHolder holder) { public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { 
    }

    /******************** added code: start ****************************/

    private void switchVisibility(View view, boolean b) {
        if (b) {
            view.setVisibility(View.VISIBLE);
        } else {
            view.setVisibility(View.GONE);
        }
    }

    public static boolean isSupportCameraLedFlash(PackageManager pm) {
        if (pm != null) {
            FeatureInfo[] features = pm.getSystemAvailableFeatures();
            if (features != null) {
                for (FeatureInfo f : features) {
                    if (f != null && PackageManager.FEATURE_CAMERA_FLASH.equals(f.name)) {
                        return true;
                    }
                }
            }
        }
        return false;
    }

    /**
     * @param flashState switch the flashlight icon to match the torch state
     */
    public void switchFlashImg(int flashState) {
        if (flashState == Constant.FLASH_OPEN) {
            flashLightIv.setImageResource(R.drawable.device_qrcode_scan_flash_on); // ic_open
            flashLightTv.setText("關(guān)閉閃光燈"); // "turn off flashlight"
        } else {
            flashLightIv.setImageResource(R.drawable.device_qrcode_scan_flash_off); // ic_close
            flashLightTv.setText("打開閃光燈"); // "turn on flashlight"
        }
    }
    /******************** added code: END ****************************/
}
CaptureActivityHandler.java
public final class CaptureActivityHandler extends Handler {

    private static final String TAG = CaptureActivityHandler.class.getSimpleName();

    private final CaptureActivity activity;
    private final DecodeThread decodeThread;
    private final CameraManager cameraManager;
    private final IMResUtil mImResUtil;
    private State state;
    // private RelativeLayout mLightLinearLay; // flashlight layout (unused)

    private enum State {
        PREVIEW,
        SUCCESS,
        DONE
    }

    public CaptureActivityHandler(CaptureActivity activity, Collection<BarcodeFormat> decodeFormats,
                                  String characterSet, CameraManager cameraManager) {
        this.activity = activity;
        mImResUtil = new IMResUtil(activity);
        decodeThread = new DecodeThread(activity, decodeFormats, characterSet,
                new ViewfinderResultPointCallback(activity.getViewfinderView()));
        decodeThread.start();
        state = State.SUCCESS;

        // Start ourselves capturing previews and decoding.
        this.cameraManager = cameraManager;
        cameraManager.startPreview();
        Log.d(TAG, "CaptureActivityHandler " + CaptureActivityHandler.class.toString());
        restartPreviewAndDecode();
    }

    @Override
    public void handleMessage(Message message) {
        int what = message.what;
        if (what == mImResUtil.getId("device_qrcode_restart_preview")) {
            restartPreviewAndDecode();
        } else if (what == mImResUtil.getId("device_qrcode_decode_succeeded")) {
            state = State.SUCCESS;
            Bundle bundle = message.getData();
            Bitmap barcode = null;
            float scaleFactor = 1.0f;
            if (bundle != null) {
                byte[] compressedBitmap = bundle.getByteArray(DecodeThread.BARCODE_BITMAP);
                if (compressedBitmap != null) {
                    barcode = BitmapFactory.decodeByteArray(compressedBitmap, 0, compressedBitmap.length, null);
                    // Mutable copy:
                    barcode = barcode.copy(Bitmap.Config.ARGB_8888, true);
                }
                scaleFactor = bundle.getFloat(DecodeThread.BARCODE_SCALED_FACTOR);
            }
            activity.handleDecode((Result) message.obj, barcode, scaleFactor);
        } else if (what == mImResUtil.getId("device_qrcode_decode_failed")) {
            // We're decoding as fast as possible, so when one decode fails, start another.
            state = State.PREVIEW;
            cameraManager.requestPreviewFrame(decodeThread.getHandler(), mImResUtil.getId("device_qrcode_decode"));
        } else if (what == mImResUtil.getId("device_qrcode_return_scan_result")) {
            activity.setResult(Activity.RESULT_OK, (Intent) message.obj);
            activity.finish();
        } else if (what == mImResUtil.getId("device_qrcode_launch_product_query")) {
            String url = (String) message.obj;

            Intent intent = new Intent(Intent.ACTION_VIEW);
            intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_WHEN_TASK_RESET);
            intent.setData(Uri.parse(url));

            ResolveInfo resolveInfo = activity.getPackageManager()
                    .resolveActivity(intent, PackageManager.MATCH_DEFAULT_ONLY);
            String browserPackageName = null;
            if (resolveInfo != null && resolveInfo.activityInfo != null) {
                browserPackageName = resolveInfo.activityInfo.packageName;
                Log.d(TAG, "Using browser in package " + browserPackageName);
            }

            // Needed for default Android browser / Chrome only apparently
            if ("com.android.browser".equals(browserPackageName) || "com.android.chrome".equals(browserPackageName)) {
                intent.setPackage(browserPackageName);
                intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                intent.putExtra(Browser.EXTRA_APPLICATION_ID, browserPackageName);
            }

            try {
                activity.startActivity(intent);
            } catch (ActivityNotFoundException ignored) {
                Log.w(TAG, "Can't find anything to handle VIEW of URI " + url);
            }
        }
    }

    public void quitSynchronously() {
        state = State.DONE;
        cameraManager.stopPreview();
        Message quit = Message.obtain(decodeThread.getHandler(), mImResUtil.getId("device_qrcode_quit"));
        quit.sendToTarget();
        try {
            // Wait at most half a second; should be enough time, and onPause() will timeout quickly
            decodeThread.join(500L);
        } catch (InterruptedException e) {
            // continue
        }

        // Be absolutely sure we don't send any queued up messages
        removeMessages(mImResUtil.getId("device_qrcode_decode_succeeded"));
        removeMessages(mImResUtil.getId("device_qrcode_decode_failed"));
    }

    private void restartPreviewAndDecode() {
        if (state == State.SUCCESS) {
            state = State.PREVIEW;
            // decodeThread.getHandler() returns the DecodeHandler
            cameraManager.requestPreviewFrame(decodeThread.getHandler(), mImResUtil.getId("device_qrcode_decode"));
            activity.drawViewfinder();
        }
    }
}
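Because this version resolves message codes through IMResUtil at runtime instead of compile-time `R` constants, `handleMessage` has to dispatch with an if/else chain on looked-up ints rather than a `switch`. A plain-Java sketch of that lookup-then-dispatch pattern, where `ResIds` is a hypothetical Map-backed stand-in for IMResUtil:

```java
import java.util.HashMap;
import java.util.Map;

public class ResIdDispatchDemo {
    // Hypothetical stand-in for IMResUtil: maps resource names to generated ids
    static class ResIds {
        private final Map<String, Integer> ids = new HashMap<>();
        private int next = 1;

        int getId(String name) {
            return ids.computeIfAbsent(name, k -> next++);
        }
    }

    static String dispatch(ResIds res, int what) {
        // Looked-up ids are not compile-time constants, so a switch is not
        // possible; this mirrors the if/else chain in handleMessage.
        if (what == res.getId("device_qrcode_restart_preview")) {
            return "restart";
        } else if (what == res.getId("device_qrcode_decode_succeeded")) {
            return "succeeded";
        } else if (what == res.getId("device_qrcode_decode_failed")) {
            return "failed";
        }
        return "unknown";
    }

    public static void main(String[] args) {
        ResIds res = new ResIds();
        int succeeded = res.getId("device_qrcode_decode_succeeded");
        System.out.println(dispatch(res, succeeded)); // → succeeded
    }
}
```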
DecodeFormatManager.java
final class DecodeFormatManager {

    private static final Pattern COMMA_PATTERN = Pattern.compile(",");

    // Only QR codes are decoded here; the other upstream format sets are disabled.
//    static final Set<BarcodeFormat> PRODUCT_FORMATS;
//    static final Set<BarcodeFormat> INDUSTRIAL_FORMATS;
//    private static final Set<BarcodeFormat> ONE_D_FORMATS;
    static final Set<BarcodeFormat> QR_CODE_FORMATS = EnumSet.of(BarcodeFormat.QR_CODE);
//    static final Set<BarcodeFormat> DATA_MATRIX_FORMATS = EnumSet.of(BarcodeFormat.DATA_MATRIX);
//    static final Set<BarcodeFormat> AZTEC_FORMATS = EnumSet.of(BarcodeFormat.AZTEC);
//    static final Set<BarcodeFormat> PDF417_FORMATS = EnumSet.of(BarcodeFormat.PDF_417);
//    static {
//        PRODUCT_FORMATS = EnumSet.of(BarcodeFormat.UPC_A, BarcodeFormat.RSS_EXPANDED);
//        INDUSTRIAL_FORMATS = EnumSet.of(BarcodeFormat.CODE_39);
//        ONE_D_FORMATS = EnumSet.copyOf(PRODUCT_FORMATS);
//        ONE_D_FORMATS.addAll(INDUSTRIAL_FORMATS);
//    }

    private static final Map<String, Set<BarcodeFormat>> FORMATS_FOR_MODE;
    static {
        FORMATS_FOR_MODE = new HashMap<>();
//        FORMATS_FOR_MODE.put(Intents.Scan.ONE_D_MODE, ONE_D_FORMATS);
//        FORMATS_FOR_MODE.put(Intents.Scan.PRODUCT_MODE, PRODUCT_FORMATS);
        FORMATS_FOR_MODE.put(Intents.Scan.QR_CODE_MODE, QR_CODE_FORMATS);
//        FORMATS_FOR_MODE.put(Intents.Scan.DATA_MATRIX_MODE, DATA_MATRIX_FORMATS);
//        FORMATS_FOR_MODE.put(Intents.Scan.AZTEC_MODE, AZTEC_FORMATS);
//        FORMATS_FOR_MODE.put(Intents.Scan.PDF417_MODE, PDF417_FORMATS);
    }

    private DecodeFormatManager() {}

    static Set<BarcodeFormat> parseDecodeFormats(Intent intent) {
        Iterable<String> scanFormats = null;
        CharSequence scanFormatsString = intent.getStringExtra(Intents.Scan.FORMATS);
        if (scanFormatsString != null) {
            scanFormats = Arrays.asList(COMMA_PATTERN.split(scanFormatsString));
        }
        return parseDecodeFormats(scanFormats, intent.getStringExtra(Intents.Scan.MODE));
    }

    static Set<BarcodeFormat> parseDecodeFormats(Uri inputUri) {
        List<String> formats = inputUri.getQueryParameters(Intents.Scan.FORMATS);
        if (formats != null && formats.size() == 1 && formats.get(0) != null) {
            formats = Arrays.asList(COMMA_PATTERN.split(formats.get(0)));
        }
        return parseDecodeFormats(formats, inputUri.getQueryParameter(Intents.Scan.MODE));
    }

    private static Set<BarcodeFormat> parseDecodeFormats(Iterable<String> scanFormats, String decodeMode) {
        if (scanFormats != null) {
            Set<BarcodeFormat> formats = EnumSet.noneOf(BarcodeFormat.class);
            try {
                for (String format : scanFormats) {
                    formats.add(BarcodeFormat.valueOf(format));
                }
                return formats;
            } catch (IllegalArgumentException iae) {
                // ignore it then
            }
        }
        if (decodeMode != null) {
            return FORMATS_FOR_MODE.get(decodeMode);
        }
        return null;
    }
}
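`parseDecodeFormats` first tries the explicit FORMATS list and only falls back to the MODE lookup table when the list is absent or contains an invalid name. A stand-alone sketch of that two-step resolution, with a local `Format` enum standing in for ZXing's `BarcodeFormat` and plain strings standing in for the `Intents.Scan` constants:

```java
import java.util.EnumSet;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class DecodeFormatsDemo {
    // Local stand-in for com.google.zxing.BarcodeFormat
    enum Format { QR_CODE, CODE_39, UPC_A }

    static final Map<String, Set<Format>> FORMATS_FOR_MODE = new HashMap<>();
    static {
        FORMATS_FOR_MODE.put("QR_CODE_MODE", EnumSet.of(Format.QR_CODE));
    }

    // Mirrors parseDecodeFormats(Iterable, String): the explicit list wins,
    // and an invalid format name falls through to the mode lookup.
    static Set<Format> parse(String formatsCsv, String mode) {
        if (formatsCsv != null) {
            try {
                Set<Format> formats = EnumSet.noneOf(Format.class);
                for (String f : formatsCsv.split(",")) {
                    formats.add(Format.valueOf(f));
                }
                return formats;
            } catch (IllegalArgumentException ignored) {
                // unknown format name: fall back to mode
            }
        }
        return mode != null ? FORMATS_FOR_MODE.get(mode) : null;
    }

    public static void main(String[] args) {
        System.out.println(parse("QR_CODE,CODE_39", null)); // → [QR_CODE, CODE_39]
        System.out.println(parse(null, "QR_CODE_MODE"));    // → [QR_CODE]
        System.out.println(parse("BOGUS", "QR_CODE_MODE")); // invalid name → [QR_CODE]
    }
}
```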
DecodeHandler.java
final class DecodeHandler extends Handler {

    private static final String TAG = DecodeHandler.class.getSimpleName();

    public static boolean isWeakLight = false;

    private final CaptureActivity activity;
    private final MultiFormatReader multiFormatReader;
    private boolean running = true;
    private final IMResUtil mImResUtil;

    DecodeHandler(CaptureActivity activity, Map<DecodeHintType, Object> hints) {
        multiFormatReader = new MultiFormatReader();
        multiFormatReader.setHints(hints);
        this.activity = activity;
        mImResUtil = new IMResUtil(activity);
    }

    @Override
    public void handleMessage(Message message) {
        if (!running) {
            return;
        }
        int what = message.what;
        if (what == mImResUtil.getId("device_qrcode_decode")) {
            decode((byte[]) message.obj, message.arg1, message.arg2);
        } else if (what == mImResUtil.getId("device_qrcode_quit")) {
            running = false;
            CaptureActivity.isLightOn = false;
            Looper.myLooper().quit();
        }
    }

    /**
     * Decode the data within the viewfinder rectangle, and time how long it took. For efficiency,
     * reuse the same reader objects from one decode to the next.
     *
     * @param data   The YUV preview frame.
     * @param width  The width of the preview frame.
     * @param height The height of the preview frame.
     */
    private void decode(byte[] data, int width, int height) {
        analysisColor(data, width, height);
        long start = System.currentTimeMillis();
        Result rawResult = null;
        PlanarYUVLuminanceSource source = activity.getCameraManager().buildLuminanceSource(data, width, height);
        if (source != null) {
            BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
            try {
                rawResult = multiFormatReader.decodeWithState(bitmap);
            } catch (ReaderException re) {
                // continue
            } finally {
                multiFormatReader.reset();
            }
        }

        Handler handler = activity.getHandler(); // the CaptureActivityHandler
        Log.d(TAG, "Found handler " + handler);
        if (rawResult != null) {
            // Don't log the barcode contents for security.
            long end = System.currentTimeMillis();
            Log.d(TAG, "Found barcode in " + (end - start) + " ms");
            if (handler != null) {
                // Notify the CaptureActivityHandler of the successful decode
                Message message = Message.obtain(handler,
                        mImResUtil.getId("device_qrcode_decode_succeeded"), rawResult);
                Bundle bundle = new Bundle();
                bundleThumbnail(source, bundle);
                message.setData(bundle);
                message.sendToTarget();
            }
        } else {
            if (handler != null) {
                Message message = Message.obtain(handler, mImResUtil.getId("device_qrcode_decode_failed"));
                message.sendToTarget();
            }
        }
    }

    private static void bundleThumbnail(PlanarYUVLuminanceSource source, Bundle bundle) {
        int[] pixels = source.renderThumbnail();
        int width = source.getThumbnailWidth();
        int height = source.getThumbnailHeight();
        Bitmap bitmap = Bitmap.createBitmap(pixels, 0, width, width, height, Bitmap.Config.ARGB_8888);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        bitmap.compress(Bitmap.CompressFormat.JPEG, 50, out);
        bundle.putByteArray(DecodeThread.BARCODE_BITMAP, out.toByteArray());
        bundle.putFloat(DecodeThread.BARCODE_SCALED_FACTOR, (float) width / source.getWidth());
    }

    private int[] decodeYUV420SP(byte[] yuv420sp, int width, int height) {
        final int frameSize = width * height;
        int[] rgb = new int[width * height];
        for (int j = 0, yp = 0; j < height; j++) {
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & ((int) yuv420sp[yp])) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }
                int y1192 = 1192 * y;
                int r = (y1192 + 1634 * v);
                int g = (y1192 - 833 * v - 400 * u);
                int b = (y1192 + 2066 * u);
                if (r < 0) r = 0;
                else if (r > 262143) r = 262143;
                if (g < 0) g = 0;
                else if (g > 262143) g = 262143;
                if (b < 0) b = 0;
                else if (b > 262143) b = 262143;
                rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
            }
        }
        return rgb;
    }

    private int getAverageColor(Bitmap bitmap) {
        int redBucket = 0;
        int greenBucket = 0;
        int blueBucket = 0;
        int pixelCount = 0;
        for (int y = 0; y < bitmap.getHeight(); y++) {
            for (int x = 0; x < bitmap.getWidth(); x++) {
                int c = bitmap.getPixel(x, y);
                pixelCount++;
                redBucket += Color.red(c);
                greenBucket += Color.green(c);
                blueBucket += Color.blue(c);
            }
        }
        return Color.rgb(redBucket / pixelCount, greenBucket / pixelCount, blueBucket / pixelCount);
    }

    /**
     * Estimate ambient brightness from the center of the preview frame, so the UI
     * can prompt the user to turn on the flashlight in weak light.
     */
    public void analysisColor(byte[] data, int width, int height) {
        // Downsample to 1/8 in each dimension before converting YUV -> RGB
        int[] rgb = decodeYUV420SP(data, width / 8, height / 8);
        Bitmap bmp = Bitmap.createBitmap(rgb, width / 8, height / 8, Bitmap.Config.ARGB_8888); // note: this call has been observed to throw here
        // Analyse a 10x10-pixel patch around the center point
        Bitmap resizeBitmap = Bitmap.createBitmap(bmp, bmp.getWidth() / 2, bmp.getHeight() / 2, 10, 10);
        float color = (float) getAverageColor(resizeBitmap);
        DecimalFormat decimalFormat1 = new DecimalFormat("0.00");
        String percent = decimalFormat1.format(color / -16777216);
        float floatPercent = Float.parseFloat(percent);
        // -16777216 is opaque black (0xFF000000): a ratio close to 1.00 means a nearly black frame
        isWeakLight = floatPercent >= 0.99 && floatPercent <= 1.00;
        // Log.i(TAG, "isWeakLight " + isWeakLight);
        if (null != resizeBitmap) {
            resizeBitmap.recycle();
        }
        bmp.recycle();
    }
}
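The weak-light test divides the average ARGB color by -16777216, which is opaque black (0xFF000000) as a signed int, so a ratio in roughly 0.99-1.00 means the sampled patch is nearly black. A plain-Java sketch of that arithmetic, skipping the DecimalFormat rounding step, with `packRgb` as a stand-in for Android's `Color.rgb`:

```java
public class WeakLightDemo {
    // Stand-in for android.graphics.Color.rgb: packs opaque ARGB as a signed int
    static int packRgb(int r, int g, int b) {
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    // Same rule as analysisColor: ratio to opaque black in [0.99, 1.00] = weak light
    static boolean isWeakLight(int averageColor) {
        float ratio = averageColor / -16777216.0f;
        return ratio >= 0.99f && ratio <= 1.00f;
    }

    public static void main(String[] args) {
        System.out.println(isWeakLight(packRgb(0, 0, 0)));       // black frame → true
        System.out.println(isWeakLight(packRgb(200, 200, 200))); // bright frame → false
    }
}
```

Note that `packRgb(0, 0, 0)` is exactly -16777216, giving a ratio of 1.0, which is why a pitch-dark center patch trips the flashlight prompt.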
DecodeHintManager.java
final class DecodeHintManager {

  private static final String TAG = DecodeHintManager.class.getSimpleName();

  // This pattern is used in decoding integer arrays.
  private static final Pattern COMMA = Pattern.compile(",");

  private DecodeHintManager() {}

  /**
   * <p>Split a query string into a list of name-value pairs.</p>
   *
   * <p>This is an alternative to {@link Uri#getQueryParameterNames()} and
   * {@link Uri#getQueryParameters(String)}, which are quirky and not suitable
   * for exist-only Uri parameters.</p>
   *
   * <p>This method ignores multiple parameters with the same name and returns the
   * first one only. This is technically incorrect, but should be acceptable due
   * to the method of processing hints: no multiple values for a hint.</p>
   *
   * @param query query to split
   * @return name-value pairs
   */
  private static Map<String,String> splitQuery(String query) {
    Map<String,String> map = new HashMap<>();
    int pos = 0;
    while (pos < query.length()) {
      if (query.charAt(pos) == '&') {
        // Skip consecutive ampersand separators.
        pos++;
        continue;
      }
      int amp = query.indexOf('&', pos);
      int equ = query.indexOf('=', pos);
      if (amp < 0) {
        // This is the last element in the query, no more ampersand elements.
        String name;
        String text;
        if (equ < 0) {
          // Simple parameter with no value.
          name = query.substring(pos);
          name = name.replace('+', ' '); // Preemptively decode +
          name = Uri.decode(name);
          text = "";
        } else {
          // Name and value.
          name = query.substring(pos, equ);
          name = name.replace('+', ' '); // Preemptively decode +
          name = Uri.decode(name);
          text = query.substring(equ + 1);
          text = text.replace('+', ' '); // Preemptively decode +
          text = Uri.decode(text);
        }
        if (!map.containsKey(name)) {
          map.put(name, text);
        }
        break;
      }
      if (equ < 0 || equ > amp) {
        // No equal sign until the &: this is a simple parameter with no value.
        String name = query.substring(pos, amp);
        name = name.replace('+', ' '); // Preemptively decode +
        name = Uri.decode(name);
        if (!map.containsKey(name)) {
          map.put(name, "");
        }
        pos = amp + 1;
        continue;
      }
      // Name and value.
      String name = query.substring(pos, equ);
      name = name.replace('+', ' '); // Preemptively decode +
      name = Uri.decode(name);
      String text = query.substring(equ + 1, amp);
      text = text.replace('+', ' '); // Preemptively decode +
      text = Uri.decode(text);
      if (!map.containsKey(name)) {
        map.put(name, text);
      }
      pos = amp + 1;
    }
    return map;
  }

  static Map<DecodeHintType,?> parseDecodeHints(Uri inputUri) {
    String query = inputUri.getEncodedQuery();
    if (query == null || query.isEmpty()) {
      return null;
    }
    Map<String,String> parameters = splitQuery(query);
    Map<DecodeHintType,Object> hints = new EnumMap<>(DecodeHintType.class);
    for (DecodeHintType hintType : DecodeHintType.values()) {
      if (hintType == DecodeHintType.CHARACTER_SET ||
          hintType == DecodeHintType.NEED_RESULT_POINT_CALLBACK ||
          hintType == DecodeHintType.POSSIBLE_FORMATS) {
        continue; // This hint is specified in another way
      }
      String parameterName = hintType.name();
      String parameterText = parameters.get(parameterName);
      if (parameterText == null) {
        continue;
      }
      if (hintType.getValueType().equals(Object.class)) {
        // This is an unspecified type of hint content. Use the value as is.
        // TODO: Can we make a different assumption on this?
        hints.put(hintType, parameterText);
        continue;
      }
      if (hintType.getValueType().equals(Void.class)) {
        // Void hints are just flags: use the constant specified by DecodeHintType
        hints.put(hintType, Boolean.TRUE);
        continue;
      }
      if (hintType.getValueType().equals(String.class)) {
        // A string hint: use the decoded value.
        hints.put(hintType, parameterText);
        continue;
      }
      if (hintType.getValueType().equals(Boolean.class)) {
        // A boolean hint: a few values for false, everything else is true.
        // An empty parameter is simply a flag-style parameter, assuming true.
        if (parameterText.isEmpty()) {
          hints.put(hintType, Boolean.TRUE);
        } else if ("0".equals(parameterText) ||
                   "false".equalsIgnoreCase(parameterText) ||
                   "no".equalsIgnoreCase(parameterText)) {
          hints.put(hintType, Boolean.FALSE);
        } else {
          hints.put(hintType, Boolean.TRUE);
        }
        continue;
      }
      if (hintType.getValueType().equals(int[].class)) {
        // An integer array. Used to specify valid lengths.
        // Strip a trailing comma as in Java-style array initialisers.
        if (!parameterText.isEmpty() && parameterText.charAt(parameterText.length() - 1) == ',') {
          parameterText = parameterText.substring(0, parameterText.length() - 1);
        }
        String[] values = COMMA.split(parameterText);
        int[] array = new int[values.length];
        for (int i = 0; i < values.length; i++) {
          try {
            array[i] = Integer.parseInt(values[i]);
          } catch (NumberFormatException ignored) {
            Log.w(TAG, "Skipping array of integers hint " + hintType +
                " due to invalid numeric value: '" + values[i] + '\'');
            array = null;
            break;
          }
        }
        if (array != null) {
          hints.put(hintType, array);
        }
        continue;
      }
      Log.w(TAG, "Unsupported hint type '" + hintType + "' of type " + hintType.getValueType());
    }
    Log.i(TAG, "Hints from the URI: " + hints);
    return hints;
  }

  static Map<DecodeHintType,Object> parseDecodeHints(Intent intent) {
    Bundle extras = intent.getExtras();
    if (extras == null || extras.isEmpty()) {
      return null;
    }
    Map<DecodeHintType,Object> hints = new EnumMap<>(DecodeHintType.class);
    for (DecodeHintType hintType : DecodeHintType.values()) {
      if (hintType == DecodeHintType.CHARACTER_SET ||
          hintType == DecodeHintType.NEED_RESULT_POINT_CALLBACK ||
          hintType == DecodeHintType.POSSIBLE_FORMATS) {
        continue; // This hint is specified in another way
      }
      String hintName = hintType.name();
      if (extras.containsKey(hintName)) {
        if (hintType.getValueType().equals(Void.class)) {
          // Void hints are just flags: use the constant specified by the DecodeHintType
          hints.put(hintType, Boolean.TRUE);
        } else {
          Object hintData = extras.get(hintName);
          if (hintType.getValueType().isInstance(hintData)) {
            hints.put(hintType, hintData);
          } else {
            Log.w(TAG, "Ignoring hint " + hintType + " because it is not assignable from " + hintData);
          }
        }
      }
    }
    Log.i(TAG, "Hints from the Intent: " + hints);
    return hints;
  }
}
DecodeThread.java
final class DecodeThread extends Thread {

  public static final String BARCODE_BITMAP = "barcode_bitmap";
  public static final String BARCODE_SCALED_FACTOR = "barcode_scaled_factor";

  private final CaptureActivity activity;
  private final Map<DecodeHintType,Object> hints;
  private final CountDownLatch handlerInitLatch;
  private Handler handler;

  public DecodeThread(CaptureActivity activity,
                      Collection<BarcodeFormat> decodeFormats,
                      String characterSet,
                      ResultPointCallback resultPointCallback) {
    this.activity = activity;
    handlerInitLatch = new CountDownLatch(1);
    hints = new EnumMap<>(DecodeHintType.class);

    // The prefs can't change while the thread is running, so pick them up once here.
    // Only QR Code formats are enabled here; the preference-driven blocks for
    // 1D product, 1D industrial, Data Matrix, Aztec and PDF417 formats are disabled.
    if (decodeFormats == null || decodeFormats.isEmpty()) {
      decodeFormats = EnumSet.noneOf(BarcodeFormat.class);
      decodeFormats.addAll(DecodeFormatManager.QR_CODE_FORMATS);
    }
    hints.put(DecodeHintType.POSSIBLE_FORMATS, DecodeFormatManager.QR_CODE_FORMATS);
    if (characterSet != null) {
      hints.put(DecodeHintType.CHARACTER_SET, characterSet);
    }
    hints.put(DecodeHintType.NEED_RESULT_POINT_CALLBACK, resultPointCallback);
    Log.i("DecodeThread", "Hints: " + hints);
  }

  Handler getHandler() {
    try {
      handlerInitLatch.await();
    } catch (InterruptedException ie) {
      // continue?
    }
    return handler;
  }

  @Override
  public void run() {
    Looper.prepare();
    handler = new DecodeHandler(activity, hints);
    handlerInitLatch.countDown();
    Looper.loop();
  }
}
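The `CountDownLatch` handshake in `getHandler()`/`run()` is what guarantees callers never see a half-initialised handler. A minimal, ZXing-free sketch of the same pattern (`FakeHandler` is just a stand-in for `DecodeHandler`, which needs an Android `Looper`):

```java
import java.util.concurrent.CountDownLatch;

// Sketch of DecodeThread's latch handshake: run() publishes the
// worker-side object and counts the latch down; getHandler() blocks
// until that has happened.
public class LatchHandshake extends Thread {

    static class FakeHandler {
        final String name;
        FakeHandler(String name) { this.name = name; }
    }

    private final CountDownLatch handlerInitLatch = new CountDownLatch(1);
    private FakeHandler handler;

    FakeHandler getHandler() {
        try {
            handlerInitLatch.await(); // block until run() has published the handler
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
        }
        return handler;
    }

    @Override
    public void run() {
        // In the real DecodeThread this is where Looper.prepare() runs and
        // the DecodeHandler is constructed with the decode hints.
        handler = new FakeHandler("decode-handler");
        handlerInitLatch.countDown();
    }

    public static void main(String[] args) {
        LatchHandshake t = new LatchHandshake();
        t.start();
        System.out.println(t.getHandler().name); // prints "decode-handler"
    }
}
```

The latch also gives the caller a happens-before edge on the `handler` write, so no extra synchronization is needed for that one field.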
ImageUtil.java
public class ImageUtil {

    public static String getImageAbsolutePath(Context context, Uri imageUri) {
        if (context == null || imageUri == null) {
            return null;
        }
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.KITKAT
                && DocumentsContract.isDocumentUri(context, imageUri)) {
            if (isExternalStorageDocument(imageUri)) {
                String docId = DocumentsContract.getDocumentId(imageUri);
                String[] split = docId.split(":");
                String type = split[0];
                if ("primary".equalsIgnoreCase(type)) {
                    return Environment.getExternalStorageDirectory() + "/" + split[1];
                }
            } else if (isDownloadsDocument(imageUri)) {
                String id = DocumentsContract.getDocumentId(imageUri);
                Uri contentUri = ContentUris.withAppendedId(
                        Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));
                return getDataColumn(context, contentUri, null, null);
            } else if (isMediaDocument(imageUri)) {
                String docId = DocumentsContract.getDocumentId(imageUri);
                String[] split = docId.split(":");
                String type = split[0];
                Uri contentUri = null;
                if ("image".equals(type)) {
                    contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
                } else if ("video".equals(type)) {
                    contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
                } else if ("audio".equals(type)) {
                    contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
                }
                String selection = MediaStore.Images.Media._ID + "=?";
                String[] selectionArgs = new String[]{split[1]};
                return getDataColumn(context, contentUri, selection, selectionArgs);
            }
        } // MediaStore (and general)
        else if ("content".equalsIgnoreCase(imageUri.getScheme())) {
            // Return the remote address
            if (isGooglePhotosUri(imageUri)) {
                return imageUri.getLastPathSegment();
            }
            return getDataColumn(context, imageUri, null, null);
        } else if ("file".equalsIgnoreCase(imageUri.getScheme())) {
            return imageUri.getPath();
        }
        return null;
    }

    public static String getDataColumn(Context context, Uri uri, String selection, String[] selectionArgs) {
        Cursor cursor = null;
        String column = MediaStore.Images.Media.DATA;
        String[] projection = {column};
        try {
            cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs, null);
            if (cursor != null && cursor.moveToFirst()) {
                int index = cursor.getColumnIndexOrThrow(column);
                return cursor.getString(index);
            }
        } finally {
            if (cursor != null) {
                cursor.close();
            }
        }
        return null;
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is ExternalStorageProvider.
     */
    public static boolean isExternalStorageDocument(Uri uri) {
        return "com.android.externalstorage.documents".equals(uri.getAuthority());
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is DownloadsProvider.
     */
    public static boolean isDownloadsDocument(Uri uri) {
        return "com.android.providers.downloads.documents".equals(uri.getAuthority());
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is MediaProvider.
     */
    public static boolean isMediaDocument(Uri uri) {
        return "com.android.providers.media.documents".equals(uri.getAuthority());
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is Google Photos.
     */
    public static boolean isGooglePhotosUri(Uri uri) {
        return "com.google.android.apps.photos.content".equals(uri.getAuthority());
    }
}
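The document-ID handling above boils down to splitting `"type:id"` and routing on the type. A hypothetical, Android-free sketch of that routing (the content-URI strings shown are the literal values behind `MediaStore.*.EXTERNAL_CONTENT_URI` plus the appended ID):

```java
// Sketch of the "type:id" routing used by getImageAbsolutePath() for
// media documents; returns the per-type content URI with the ID appended,
// or null for an unknown type.
public class DocIdRouter {

    public static String contentUriFor(String docId) {
        String[] split = docId.split(":");
        String type = split[0];
        switch (type) {
            case "image": return "content://media/external/images/media/" + split[1];
            case "video": return "content://media/external/video/media/" + split[1];
            case "audio": return "content://media/external/audio/media/" + split[1];
            default:      return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(contentUriFor("image:42"));
    }
}
```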
InactivityTimer.java
public final class InactivityTimer {

  private static final String TAG = InactivityTimer.class.getSimpleName();

  private static final long INACTIVITY_DELAY_MS = 5 * 60 * 1000L;

  private final Activity activity;
  private final BroadcastReceiver powerStatusReceiver;
  private boolean registered;
  private AsyncTask<Object,Object,Object> inactivityTask;

  public InactivityTimer(Activity activity) {
    this.activity = activity;
    powerStatusReceiver = new PowerStatusReceiver();
    registered = false;
    onActivity();
  }

  public synchronized void onActivity() {
    cancel();
    inactivityTask = new InactivityAsyncTask();
    inactivityTask.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
  }

  public synchronized void onPause() {
    cancel();
    if (registered) {
      activity.unregisterReceiver(powerStatusReceiver);
      registered = false;
    } else {
      Log.w(TAG, "PowerStatusReceiver was never registered?");
    }
  }

  public synchronized void onResume() {
    if (registered) {
      Log.w(TAG, "PowerStatusReceiver was already registered?");
    } else {
      activity.registerReceiver(powerStatusReceiver, new IntentFilter(Intent.ACTION_BATTERY_CHANGED));
      registered = true;
    }
    onActivity();
  }

  private synchronized void cancel() {
    AsyncTask<?,?,?> task = inactivityTask;
    if (task != null) {
      task.cancel(true);
      inactivityTask = null;
    }
  }

  private final class PowerStatusReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
      if (Intent.ACTION_BATTERY_CHANGED.equals(intent.getAction())) {
        // 0 indicates that we're on battery
        boolean onBatteryNow = intent.getIntExtra(BatteryManager.EXTRA_PLUGGED, -1) <= 0;
        if (onBatteryNow) {
          InactivityTimer.this.onActivity();
        } else {
          InactivityTimer.this.cancel();
        }
      }
    }
  }

  private final class InactivityAsyncTask extends AsyncTask<Object,Object,Object> {
    @Override
    protected Object doInBackground(Object... objects) {
      try {
        Thread.sleep(INACTIVITY_DELAY_MS);
        Log.i(TAG, "Finishing activity due to inactivity");
        activity.finish();
      } catch (InterruptedException e) {
        // continue without killing
      }
      return null;
    }
  }
}
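The receiver above decides "on battery" from `BatteryManager.EXTRA_PLUGGED`: any value at or below zero (including the `-1` default when the extra is missing) counts as unplugged, which re-arms the shutdown timer; a positive value (AC/USB/wireless charging) cancels it. That one-line policy, isolated for testing:

```java
// The plugged-state policy used by PowerStatusReceiver above:
// true means the inactivity timer should keep running.
public class PowerPolicy {

    public static boolean onBattery(int pluggedExtra) {
        return pluggedExtra <= 0;
    }

    public static void main(String[] args) {
        System.out.println(onBattery(-1)); // extra missing: assume battery
        System.out.println(onBattery(1));  // BatteryManager.BATTERY_PLUGGED_AC
    }
}
```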
Intents.java
public final class Intents {

  private Intents() {}

  public static final class Scan {

    /**
     * Send this intent to open the Barcodes app in scanning mode, find a barcode, and return
     * the results.
     */
    public static final String ACTION = "com.google.zxing.client.android.SCAN";

    /**
     * By default, sending this will decode all barcodes that we understand. However it
     * may be useful to limit scanning to certain formats. Use
     * {@link android.content.Intent#putExtra(String, String)} with one of the values below.
     * Setting this is effectively shorthand for setting explicit formats with {@link #FORMATS}.
     * It is overridden by that setting.
     */
    public static final String MODE = "SCAN_MODE";

    /**
     * Decode only UPC and EAN barcodes. This is the right choice for shopping apps which get
     * prices, reviews, etc. for products.
     */
    public static final String PRODUCT_MODE = "PRODUCT_MODE";

    /** Decode only 1D barcodes. */
    public static final String ONE_D_MODE = "ONE_D_MODE";

    /** Decode only QR codes. */
    public static final String QR_CODE_MODE = "QR_CODE_MODE";

    /** Decode only Data Matrix codes. */
    public static final String DATA_MATRIX_MODE = "DATA_MATRIX_MODE";

    /** Decode only Aztec codes. */
    public static final String AZTEC_MODE = "AZTEC_MODE";

    /** Decode only PDF417 codes. */
    public static final String PDF417_MODE = "PDF417_MODE";

    /**
     * Comma-separated list of formats to scan for. The values must match the names of
     * {@link com.google.zxing.BarcodeFormat}s, e.g. {@link com.google.zxing.BarcodeFormat#EAN_13}.
     * Example: "EAN_13,EAN_8,QR_CODE". This overrides {@link #MODE}.
     */
    public static final String FORMATS = "SCAN_FORMATS";

    /**
     * Optional parameter to specify the id of the camera from which to recognize barcodes.
     * Overrides the default camera that would otherwise have been selected.
     * If provided, should be an int.
     */
    public static final String CAMERA_ID = "SCAN_CAMERA_ID";

    /** @see com.google.zxing.DecodeHintType#CHARACTER_SET */
    public static final String CHARACTER_SET = "CHARACTER_SET";

    /**
     * Optional parameters to specify the width and height of the scanning rectangle in pixels.
     * The app will try to honor these, but will clamp them to the size of the preview frame.
     * You should specify both or neither, and pass the size as an int.
     */
    public static final String WIDTH = "SCAN_WIDTH";
    public static final String HEIGHT = "SCAN_HEIGHT";

    /**
     * Desired duration in milliseconds for which to pause after a successful scan before
     * returning to the calling intent. Specified as a long, not an integer!
     * For example: 1000L, not 1000.
     */
    public static final String RESULT_DISPLAY_DURATION_MS = "RESULT_DISPLAY_DURATION_MS";

    /** Prompt to show on-screen when scanning by intent. Specified as a {@link String}. */
    public static final String PROMPT_MESSAGE = "PROMPT_MESSAGE";

    /**
     * If a barcode is found, Barcodes returns {@link android.app.Activity#RESULT_OK} to
     * {@link android.app.Activity#onActivityResult(int, int, android.content.Intent)}
     * of the app which requested the scan via
     * {@link android.app.Activity#startActivityForResult(android.content.Intent, int)}.
     * The barcode's contents can be retrieved with
     * {@link android.content.Intent#getStringExtra(String)}.
     * If the user presses Back, the result code will be {@link android.app.Activity#RESULT_CANCELED}.
     */
    public static final String RESULT = "SCAN_RESULT";

    /**
     * Call {@link android.content.Intent#getStringExtra(String)} with {@link #RESULT_FORMAT}
     * to determine which barcode format was found.
     * See {@link com.google.zxing.BarcodeFormat} for possible values.
     */
    public static final String RESULT_FORMAT = "SCAN_RESULT_FORMAT";

    /**
     * Call {@link android.content.Intent#getStringExtra(String)} with {@link #RESULT_UPC_EAN_EXTENSION}
     * to return the content of any UPC extension barcode that was also found. Only applicable
     * to {@link com.google.zxing.BarcodeFormat#UPC_A} and {@link com.google.zxing.BarcodeFormat#EAN_13}.
     */
    public static final String RESULT_UPC_EAN_EXTENSION = "SCAN_RESULT_UPC_EAN_EXTENSION";

    /**
     * Call {@link android.content.Intent#getByteArrayExtra(String)} with {@link #RESULT_BYTES}
     * to get a {@code byte[]} of raw bytes in the barcode, if available.
     */
    public static final String RESULT_BYTES = "SCAN_RESULT_BYTES";

    /**
     * Key for the value of {@link com.google.zxing.ResultMetadataType#ORIENTATION}, if available.
     * Call {@link android.content.Intent#getIntArrayExtra(String)} with {@link #RESULT_ORIENTATION}.
     */
    public static final String RESULT_ORIENTATION = "SCAN_RESULT_ORIENTATION";

    /**
     * Key for the value of {@link com.google.zxing.ResultMetadataType#ERROR_CORRECTION_LEVEL}, if available.
     * Call {@link android.content.Intent#getStringExtra(String)} with {@link #RESULT_ERROR_CORRECTION_LEVEL}.
     */
    public static final String RESULT_ERROR_CORRECTION_LEVEL = "SCAN_RESULT_ERROR_CORRECTION_LEVEL";

    /**
     * Prefix for keys that map to the values of {@link com.google.zxing.ResultMetadataType#BYTE_SEGMENTS},
     * if available. The actual values will be set under a series of keys formed by adding 0, 1, 2, ...
     * to this prefix. So the first byte segment is under key "SCAN_RESULT_BYTE_SEGMENTS_0" for example.
     * Call {@link android.content.Intent#getByteArrayExtra(String)} with these keys.
     */
    public static final String RESULT_BYTE_SEGMENTS_PREFIX = "SCAN_RESULT_BYTE_SEGMENTS_";

    /**
     * Setting this to false will not save scanned codes in the history. Specified as a {@code boolean}.
     */
    public static final String SAVE_HISTORY = "SAVE_HISTORY";

    private Scan() {}
  }

  public static final class History {
    public static final String ITEM_NUMBER = "ITEM_NUMBER";
    private History() {}
  }

  public static final class Encode {

    /**
     * Send this intent to encode a piece of data as a QR code and display it full screen, so
     * that another person can scan the barcode from your screen.
     */
    public static final String ACTION = "com.google.zxing.client.android.ENCODE";

    /**
     * The data to encode. Use {@link android.content.Intent#putExtra(String, String)} or
     * {@link android.content.Intent#putExtra(String, android.os.Bundle)},
     * depending on the type and format specified. Non-QR Code formats should
     * just use a String here. For QR Code, see Contents for details.
     */
    public static final String DATA = "ENCODE_DATA";

    /**
     * The type of data being supplied if the format is QR Code. Use
     * {@link android.content.Intent#putExtra(String, String)} with one of {@link Contents.Type}.
     */
    public static final String TYPE = "ENCODE_TYPE";

    /**
     * The barcode format to be displayed. If this isn't specified or is blank,
     * it defaults to QR Code. Use {@link android.content.Intent#putExtra(String, String)}, where
     * format is one of {@link com.google.zxing.BarcodeFormat}.
     */
    public static final String FORMAT = "ENCODE_FORMAT";

    /**
     * Normally the contents of the barcode are displayed to the user in a TextView. Setting this
     * boolean to false will hide that TextView, showing only the encoded barcode.
     */
    public static final String SHOW_CONTENTS = "ENCODE_SHOW_CONTENTS";

    private Encode() {}
  }

  public static final class SearchBookContents {

    /** Use Google Book Search to search the contents of the book provided. */
    public static final String ACTION = "com.google.zxing.client.android.SEARCH_BOOK_CONTENTS";

    /** The book to search, identified by ISBN number. */
    public static final String ISBN = "ISBN";

    /** An optional field which is the text to search for. */
    public static final String QUERY = "QUERY";

    private SearchBookContents() {}
  }

  public static final class WifiConnect {

    /** Internal intent used to trigger connection to a wi-fi network. */
    public static final String ACTION = "com.google.zxing.client.android.WIFI_CONNECT";

    /** The network to connect to; all the configuration is provided here. */
    public static final String SSID = "SSID";
    public static final String TYPE = "TYPE";
    public static final String PASSWORD = "PASSWORD";

    private WifiConnect() {}
  }

  public static final class Share {

    /**
     * Give the user a choice of items to encode as a barcode, then render it as a QR Code and
     * display onscreen for a friend to scan with their phone.
     */
    public static final String ACTION = "com.google.zxing.client.android.SHARE";

    private Share() {}
  }
}
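A caller drives the scanner purely through these string extras. Below is a sketch of a typical scan configuration with a plain `Map` standing in for the Android `Intent` extras (the keys and value types match the `Intents.Scan` documentation above; the concrete values are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// How a caller would populate Intent extras for Intents.Scan.ACTION;
// a plain Map stands in for android.content.Intent here.
public class ScanIntentSketch {

    public static final String ACTION = "com.google.zxing.client.android.SCAN";

    public static Map<String, Object> buildScanExtras() {
        Map<String, Object> extras = new HashMap<>();
        extras.put("SCAN_MODE", "QR_CODE_MODE");         // Intents.Scan.MODE
        extras.put("SCAN_WIDTH", 600);                   // int, clamped to the preview frame
        extras.put("SCAN_HEIGHT", 600);
        extras.put("RESULT_DISPLAY_DURATION_MS", 1000L); // a long, not an int!
        extras.put("PROMPT_MESSAGE", "Align the QR code inside the frame");
        return extras;
    }

    public static void main(String[] args) {
        System.out.println(buildScanExtras().get("SCAN_MODE"));
    }
}
```

On a scan result, the same caller would read `SCAN_RESULT` and `SCAN_RESULT_FORMAT` back out of the returned intent in `onActivityResult`.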
ViewfinderResultPointCallback.java
final class ViewfinderResultPointCallback implements ResultPointCallback {

  private final ViewfinderView viewfinderView;

  ViewfinderResultPointCallback(ViewfinderView viewfinderView) {
    this.viewfinderView = viewfinderView;
  }

  @Override
  public void foundPossibleResultPoint(ResultPoint point) {
    viewfinderView.addPossibleResultPoint(point);
  }
}
CodeCreator.java
public class CodeCreator {

    public static Bitmap createQRCode(String content, int w, int h, Bitmap logo) throws WriterException {
        if (TextUtils.isEmpty(content)) {
            return null;
        }
        int offsetX = w / 2;
        int offsetY = h / 2;
        int logoW = 0;
        int logoH = 0;
        Bitmap logoBitmap = null;

        if (logo != null) {
            // Scale the logo down to at most one fifth of the QR code's size.
            Matrix m = new Matrix();
            float scaleFactor = Math.min(w * 1.0f / 5 / logo.getWidth(), h * 1.0f / 5 / logo.getHeight());
            m.postScale(scaleFactor, scaleFactor);
            logoBitmap = Bitmap.createBitmap(logo, 0, 0, logo.getWidth(), logo.getHeight(), m, true);
        }
        // If the logo is not null, recompute the offsets so it is centred.
        if (logoBitmap != null) {
            logoW = logoBitmap.getWidth();
            logoH = logoBitmap.getHeight();
            offsetX = (w - logoW) / 2;
            offsetY = (h - logoH) / 2;
        }

        Hashtable<EncodeHintType, Object> hints = new Hashtable<>();
        hints.put(EncodeHintType.CHARACTER_SET, "utf-8");
        hints.put(EncodeHintType.ERROR_CORRECTION, ErrorCorrectionLevel.H);
        hints.put(EncodeHintType.MARGIN, 0);

        // Encode at the target size directly. Generating a bitmap and scaling it
        // afterwards blurs the modules and can make the code unreadable.
        BitMatrix matrix = new MultiFormatWriter().encode(content, BarcodeFormat.QR_CODE, w, h, hints);

        // Flatten the bit matrix into a row-major pixel array.
        int[] pixels = new int[w * h];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (x >= offsetX && x < offsetX + logoW && y >= offsetY && y < offsetY + logoH) {
                    // Inside the logo area: copy the logo pixel.
                    pixels[y * w + x] = logoBitmap.getPixel(x - offsetX, y - offsetY);
                } else {
                    pixels[y * w + x] = matrix.get(x, y) ? 0xff000000 : 0xffffffff;
                }
            }
        }
        Bitmap bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
        bitmap.setPixels(pixels, 0, w, 0, 0, w, h);
        return bitmap;
    }
}
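The logo placement above is just arithmetic: shrink the logo to at most one fifth of the code's width and height, then centre it. Extracted here as a standalone sketch so the numbers can be checked without Android's `Bitmap`/`Matrix` classes (`LogoPlacement` is a hypothetical name):

```java
// The scale/offset maths behind createQRCode()'s logo overlay.
public class LogoPlacement {

    // Scale factor that fits logoW x logoH into (w/5) x (h/5),
    // preserving the aspect ratio.
    public static float scaleFactor(int w, int h, int logoW, int logoH) {
        return Math.min(w * 1.0f / 5 / logoW, h * 1.0f / 5 / logoH);
    }

    // Offset that centres a span of size `logo` inside a span of size `total`.
    public static int offset(int total, int logo) {
        return (total - logo) / 2;
    }

    public static void main(String[] args) {
        float s = scaleFactor(500, 500, 200, 100); // 0.5f: limited by the logo width
        int scaledW = (int) (200 * s);             // 100
        System.out.println(offset(500, scaledW));  // 200
    }
}
```

Note that `ErrorCorrectionLevel.H` in the class above matters for the same reason: level H tolerates roughly 30% module damage, which is what leaves room for both the logo overlay and the dirt and scratches described at the start of the post.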
QRCodeEncoder.java
public final class QRCodeEncoder {

  private static final String TAG = QRCodeEncoder.class.getSimpleName();

  private static final int WHITE = 0xFFFFFFFF;
  private static final int BLACK = 0xFF000000;

  private QRCodeEncoder() {}

  public static Bitmap encodeAsBitmap(String contents, int dimension) throws WriterException {
    String contentsToEncode = contents;
    if (contentsToEncode == null) {
      return null;
    }
    Map<EncodeHintType,Object> hints = null;
    String encoding = guessAppropriateEncoding(contentsToEncode);
    if (encoding != null) {
      hints = new EnumMap<>(EncodeHintType.class);
      hints.put(EncodeHintType.CHARACTER_SET, encoding);
    }
    BitMatrix result;
    try {
      result = new MultiFormatWriter().encode(contentsToEncode, BarcodeFormat.QR_CODE, dimension, dimension, hints);
    } catch (IllegalArgumentException iae) {
      // Unsupported format
      return null;
    }
    int width = result.getWidth();
    int height = result.getHeight();
    int[] pixels = new int[width * height];
    for (int y = 0; y < height; y++) {
      int offset = y * width;
      for (int x = 0; x < width; x++) {
        pixels[offset + x] = result.get(x, y) ? BLACK : WHITE;
      }
    }
    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
    return bitmap;
  }

  private static String guessAppropriateEncoding(CharSequence contents) {
    // Very crude at the moment
    for (int i = 0; i < contents.length(); i++) {
      if (contents.charAt(i) > 0xFF) {
        return "UTF-8";
      }
    }
    return null;
  }
}
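`guessAppropriateEncoding()` above is crude but effective: if any character falls outside the ISO-8859-1 range (code point above 0xFF), request UTF-8, otherwise let ZXing use its default encoding. Reproduced standalone so it can be exercised directly:

```java
// Standalone copy of QRCodeEncoder.guessAppropriateEncoding():
// returns "UTF-8" when the content needs it, else null (use default).
public class EncodingGuess {

    public static String guess(CharSequence contents) {
        for (int i = 0; i < contents.length(); i++) {
            if (contents.charAt(i) > 0xFF) {
                return "UTF-8";
            }
        }
        return null; // the default encoding is fine
    }

    public static void main(String[] args) {
        System.out.println(guess("hello"));
        System.out.println(guess("集裝箱"));
    }
}
```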
ViewfinderView.java
public final class ViewfinderView extends View {

    private static final long ANIMATION_DELAY = 10;
    private static final int CORNER_RECT_HEIGHT = 40;
    private static final int CORNER_RECT_WIDTH = 8;
    private static final int OPAQUE = 255;
    private static final int[] SCANNER_ALPHA = new int[]{0, 64, 128, 192, OPAQUE, 192, 128, 64};
    private static final int SCANNER_LINE_HEIGHT = 10;
    private static final int SCANNER_LINE_MOVE_DISTANCE = 5;
    public static int scannerEnd = 0;
    public static int scannerStart = 0;

    private CameraManager cameraManager;
    private final int cornerColor;
    private final int frameColor;
    private String labelText;
    private int labelTextColor;
    private float labelTextSize;
    private final int laserColor;
    private Collection<ResultPoint> lastPossibleResultPoints;
    private final IMResUtil mImResUtil;
    private final int maskColor;
    private final Paint paint = new Paint();
    private Collection<ResultPoint> possibleResultPoints;
    private Bitmap resultBitmap;
    private final int resultColor;
    private final int resultPointColor;
    private int scannerAlpha;

    public ViewfinderView(Context context, AttributeSet attrs) {
        super(context, attrs);
        this.mImResUtil = new IMResUtil(context);
        TypedArray array = context.obtainStyledAttributes(attrs, this.mImResUtil.getStyleableArray("ViewfinderView"));
        this.laserColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_laser_color"), 65280);
        this.cornerColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_corner_color"), 65280);
        this.frameColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_frame_color"), 16777215);
        this.resultPointColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_result_point_color"), -1056964864);
        this.maskColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_mask_color"), 1610612736);
        this.resultColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_result_color"), -1342177280);
        this.labelTextColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_label_text_color"), -1862270977);
        this.labelText = array.getString(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_label_text"));
        this.labelTextSize = (float) array.getDimensionPixelSize(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_label_text_size"),
                (int) TypedValue.applyDimension(2, 16.0f, getResources().getDisplayMetrics()));
        array.recycle();
        this.paint.setAntiAlias(true);
        this.possibleResultPoints = new HashSet<>(5);
    }

    public void setCameraManager(CameraManager cameraManager) {
        this.cameraManager = cameraManager;
    }

    @Override
    public void onDraw(Canvas canvas) {
        Rect frame = this.cameraManager.getFramingRect();
        if (frame == null) {
            return;
        }
        if (scannerStart == 0 || scannerEnd == 0) {
            scannerStart = frame.top;
            scannerEnd = frame.bottom;
        }
        drawExterior(canvas, frame, canvas.getWidth(), canvas.getHeight());
        if (this.resultBitmap != null) {
            this.paint.setAlpha(OPAQUE);
            canvas.drawBitmap(this.resultBitmap, (float) frame.left, (float) frame.top, this.paint);
        } else {
            drawFrame(canvas, frame);
            drawCorner(canvas, frame);
            drawLaserScanner(canvas, frame);
            drawTextInfo(canvas, frame);
            Collection<ResultPoint> currentPossible = this.possibleResultPoints;
            Collection<ResultPoint> currentLast = this.lastPossibleResultPoints;
            if (currentPossible.isEmpty()) {
                this.lastPossibleResultPoints = null;
            } else {
                this.possibleResultPoints = new HashSet<>(5);
                this.lastPossibleResultPoints = currentPossible;
                this.paint.setAlpha(OPAQUE);
                this.paint.setColor(this.resultPointColor);
                for (ResultPoint point : currentPossible) {
                    canvas.drawCircle(frame.left + point.getX(), frame.top + point.getY(), 6.0f, this.paint);
                }
            }
            if (currentLast != null) {
                this.paint.setAlpha(127);
                this.paint.setColor(this.resultPointColor);
                for (ResultPoint point2 : currentLast) {
                    canvas.drawCircle(frame.left + point2.getX(), frame.top + point2.getY(), 3.0f, this.paint);
                }
            }
            postInvalidateDelayed(ANIMATION_DELAY, frame.left, frame.top, frame.right, frame.bottom);
        }
    }

    private void drawTextInfo(Canvas canvas, Rect frame) {
        if (TextUtils.isEmpty(this.labelText)) {
            return;
        }
        this.paint.setColor(this.labelTextColor);
        this.paint.setTextSize(TypedValue.applyDimension(2, this.labelTextSize, getResources().getDisplayMetrics()));
        this.paint.setTextAlign(Align.CENTER);
        // Draw the label centred horizontally, just below the framing rect.
        canvas.drawText(this.labelText, frame.left + frame.width() / 2f,
                frame.bottom + this.paint.getTextSize() * 2f, this.paint);
    }

    private void drawCorner(Canvas canvas, Rect frame) {
        this.paint.setColor(this.cornerColor);
        // Top-left
        canvas.drawRect(frame.left, frame.top, frame.left + CORNER_RECT_WIDTH, frame.top + CORNER_RECT_HEIGHT, this.paint);
        canvas.drawRect(frame.left, frame.top, frame.left + CORNER_RECT_HEIGHT, frame.top + CORNER_RECT_WIDTH, this.paint);
        // Top-right
        canvas.drawRect(frame.right - CORNER_RECT_WIDTH, frame.top, frame.right, frame.top + CORNER_RECT_HEIGHT, this.paint);
        canvas.drawRect(frame.right - CORNER_RECT_HEIGHT, frame.top, frame.right, frame.top + CORNER_RECT_WIDTH, this.paint);
        // Bottom-left
        canvas.drawRect(frame.left, frame.bottom - CORNER_RECT_WIDTH, frame.left + CORNER_RECT_HEIGHT, frame.bottom, this.paint);
        canvas.drawRect(frame.left, frame.bottom - CORNER_RECT_HEIGHT, frame.left + CORNER_RECT_WIDTH, frame.bottom, this.paint);
        // Bottom-right
        canvas.drawRect(frame.right - CORNER_RECT_WIDTH, frame.bottom - CORNER_RECT_HEIGHT, frame.right, frame.bottom, this.paint);
        canvas.drawRect(frame.right - CORNER_RECT_HEIGHT, frame.bottom - CORNER_RECT_WIDTH, frame.right, frame.bottom, this.paint);
    }

    private void drawLaserScanner(Canvas canvas, Rect frame) {
        this.paint.setColor(this.laserColor);
        // Several shaders are constructed, but only radialGradient is used below.
        LinearGradient linearGradient = new LinearGradient(frame.left, scannerStart, frame.left,
                scannerStart + SCANNER_LINE_HEIGHT, shadeColor(this.laserColor), this.laserColor, TileMode.MIRROR);
        RadialGradient radialGradient = new RadialGradient(frame.left + frame.width() / 2f,
                scannerStart + SCANNER_LINE_HEIGHT / 2f, 360.0f, this.laserColor, shadeColor(this.laserColor), TileMode.MIRROR);
        SweepGradient sweepGradient = new SweepGradient(frame.left + frame.width() / 2f,
                scannerStart + SCANNER_LINE_HEIGHT, shadeColor(this.laserColor), this.laserColor);
        ComposeShader composeShader = new ComposeShader(radialGradient, linearGradient, Mode.ADD);
        this.paint.setShader(radialGradient);
        if (scannerStart <= scannerEnd) {
            canvas.drawOval(new RectF(frame.left + 20, scannerStart, frame.right - 20,
                    scannerStart + SCANNER_LINE_HEIGHT), this.paint);
            scannerStart += SCANNER_LINE_MOVE_DISTANCE;
        } else {
            scannerStart = frame.top;
        }
        this.paint.setShader(null);
    }

    // Derives a translucent (0x20 alpha) variant of the given colour.
    public int shadeColor(int color) {
        return Integer.valueOf("20" + Integer.toHexString(color).substring(2), 16).intValue();
    }

    private void drawFrame(Canvas canvas, Rect frame) {
        this.paint.setColor(this.frameColor);
        canvas.drawRect(frame.left, frame.top, frame.right + 1, frame.top + 2, this.paint);
        canvas.drawRect(frame.left, frame.top + 2, frame.left + 2, frame.bottom - 1, this.paint);
        canvas.drawRect(frame.right - 1, frame.top, frame.right + 1, frame.bottom - 1, this.paint);
        canvas.drawRect(frame.left, frame.bottom - 1, frame.right + 1, frame.bottom + 1, this.paint);
    }

    private void drawExterior(Canvas canvas, Rect frame, int width, int height) {
        this.paint.setColor(this.resultBitmap != null ? this.resultColor : this.maskColor);
        canvas.drawRect(0.0f, 0.0f, width, frame.top, this.paint);
        canvas.drawRect(0.0f, frame.top, frame.left, frame.bottom + 1, this.paint);
        canvas.drawRect(frame.right + 1, frame.top, width, frame.bottom + 1, this.paint);
        canvas.drawRect(0.0f, frame.bottom + 1, width, height, this.paint);
    }

    public void drawViewfinder() {
        this.resultBitmap = null;
        invalidate();
    }

    public void drawResultBitmap(Bitmap barcode) {
        this.resultBitmap = barcode;
        invalidate();
    }

    public void addPossibleResultPoint(ResultPoint point) {
        this.possibleResultPoints.add(point);
    }

    public void setLabelText(String labelText) {
        this.labelText = labelText;
    }

    public void setLabelTextColor(int labelTextColor) {
        this.labelTextColor = labelTextColor;
    }

    public void setLabelTextSize(float labelTextSize) {
        this.labelTextSize = labelTextSize;
    }
}
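The `shadeColor()` helper above derives a translucent (alpha `0x20`) variant of the laser colour by string-juggling its hex form. It only behaves as intended when the input already has a non-zero alpha byte (so `Integer.toHexString` yields 8 digits); this standalone variant adds a guard for that assumption:

```java
// Variant of ViewfinderView.shadeColor() with an explicit precondition:
// the input must be a full ARGB colour (8 hex digits), otherwise the
// substring trick would corrupt the value.
public class ShadeColor {

    public static int shade(int argb) {
        String hex = Integer.toHexString(argb);
        if (hex.length() < 8) {
            throw new IllegalArgumentException("expected a full ARGB colour");
        }
        return Integer.valueOf("20" + hex.substring(2), 16);
    }

    public static void main(String[] args) {
        System.out.printf("%08x%n", shade(0xFF00FF00)); // 2000ff00
    }
}
```

This is worth knowing because the XML default laser colour (`65280`, i.e. `0x0000FF00`) has a zero alpha byte and would hit exactly that corrupting path in the original helper.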
FileUtil.java
private static String TAG = "FileUtil"; public static String randomFileName(String ext) { return Long.toString(System.currentTimeMillis()) + ext; public static boolean deleteFile(String name) { File file = new File(name); @SuppressWarnings("deprecation") public static String rotateAndSaveBitmap(File file, int outW, int outH, String outPath, Bitmap.CompressFormat format, int quality) { int orientation = ExifInterface.ORIENTATION_NORMAL; ExifInterface exif = new ExifInterface(file.getAbsolutePath()); orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL); int w = exif.getAttributeInt(ExifInterface.TAG_IMAGE_WIDTH, 0); int h = exif.getAttributeInt(ExifInterface.TAG_IMAGE_LENGTH, 0); case ExifInterface.ORIENTATION_UNDEFINED: case ExifInterface.ORIENTATION_ROTATE_90: case ExifInterface.ORIENTATION_ROTATE_270: // boost decode bitmap performance if (outW <= 0 && outH <= 0) { } else if (outW <= 0 || outH <= 0) { sampleSize = originH / outH; outW = (int) (originW * (outH / (float) originH)); sampleSize = originW / outW; outH = (int) (originH * (outW / (float) originW)); sampleSize = Math.min(originW / outW, originH / outH); BitmapFactory.Options bmOptions = new BitmapFactory.Options(); bmOptions.inJustDecodeBounds = false; bmOptions.inSampleSize = sampleSize; bmOptions.inPurgeable = true; Bitmap bmp = BitmapFactory.decodeFile(file.getAbsolutePath(), bmOptions); Matrix matrix = new Matrix(); case ExifInterface.ORIENTATION_NORMAL: scaleX /= bmp.getWidth(); scaleY /= bmp.getHeight(); matrix.postScale(scaleX, scaleY); case ExifInterface.ORIENTATION_FLIP_HORIZONTAL: scaleX /= bmp.getWidth(); scaleY /= bmp.getHeight(); matrix.postScale(scaleX, scaleY); case ExifInterface.ORIENTATION_ROTATE_180: scaleX /= bmp.getWidth(); scaleY /= bmp.getHeight(); matrix.postScale(scaleX, scaleY); case ExifInterface.ORIENTATION_FLIP_VERTICAL: scaleX /= bmp.getWidth(); scaleY /= bmp.getHeight(); matrix.postScale(scaleX, scaleY); case 
ExifInterface.ORIENTATION_TRANSPOSE:
                scaleX /= bmp.getWidth();
                scaleY /= bmp.getHeight();
                matrix.postScale(scaleX, scaleY);
                break;
            case ExifInterface.ORIENTATION_ROTATE_90:
                scaleX /= bmp.getHeight();
                scaleY /= bmp.getWidth();
                matrix.postScale(scaleX, scaleY);
                break;
            case ExifInterface.ORIENTATION_TRANSVERSE:
                scaleX /= bmp.getWidth();
                scaleY /= bmp.getHeight();
                matrix.postScale(scaleX, scaleY);
                break;
            case ExifInterface.ORIENTATION_ROTATE_270:
                scaleX /= bmp.getHeight();
                scaleY /= bmp.getWidth();
                matrix.postScale(scaleX, scaleY);
                break;
        }
        Bitmap bmpNew = Bitmap.createBitmap(bmp, 0, 0, bmp.getWidth(), bmp.getHeight(), matrix, true);
        String newFile = bmp2File(bmpNew, outPath, format, quality);
        return newFile;
    }

    private static Point getBitmapSize(String file) {
        BitmapFactory.Options bmOptions = new BitmapFactory.Options();
        bmOptions.inJustDecodeBounds = true;
        BitmapFactory.decodeFile(file, bmOptions);
        int originW = bmOptions.outWidth;
        int originH = bmOptions.outHeight;
        return new Point(originW, originH);
    }

    @SuppressWarnings("deprecation")
    public static String saveBitmap(File f, int outW, int outH, String outPath,
                                    Bitmap.CompressFormat format, int quality) {
        if (f == null) return "";
        // If the source carries an EXIF rotation, rotate and save it first.
        String rotateFile = rotateAndSaveBitmap(f, outW, outH, outPath, format, quality);
        if (rotateFile != null && !rotateFile.isEmpty()) {
            return rotateFile;
        }
        String file = f.getAbsolutePath();

        // Get the dimensions of the bitmap
        Point size = getBitmapSize(file);
        int originW = size.x;
        int originH = size.y;

        // Determine how much to scale down the image
        int scaleFactor = 1;
        if (outW <= 0 && outH <= 0) {
            outW = originW;
            outH = originH;
        } else if (outW <= 0) {
            scaleFactor = originH / outH;
            outW = (int) (originW * (outH / (float) originH));
        } else if (outH <= 0) {
            scaleFactor = originW / outW;
            outH = (int) (originH * (outW / (float) originW));
        } else {
            scaleFactor = Math.min(originW / outW, originH / outH);
        }

        // Decode the image file into a Bitmap sized to fit our size
        BitmapFactory.Options bmOptions = new BitmapFactory.Options();
        bmOptions.inJustDecodeBounds = false;
        bmOptions.inSampleSize = scaleFactor;
        bmOptions.inPurgeable = true;
        Bitmap bmp = BitmapFactory.decodeFile(file, bmOptions);

        String newFile;
        if (bmp.getWidth() != outW && bmp.getHeight() != outH) {
            Bitmap bmpNew = Bitmap.createScaledBitmap(bmp, outW, outH, true);
            newFile = bmp2File(bmpNew, outPath, format, quality);
        } else {
            newFile = bmp2File(bmp, outPath, format, quality);
        }
        return newFile;
    }

    public static Point calcScaleSize(String file, int outW, int outH) {
        int originW = 0;
        int originH = 0;
        try {
            ExifInterface exif = new ExifInterface(file);
            int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
                    ExifInterface.ORIENTATION_NORMAL);
            int w = exif.getAttributeInt(ExifInterface.TAG_IMAGE_WIDTH, 0);
            int h = exif.getAttributeInt(ExifInterface.TAG_IMAGE_LENGTH, 0);
            switch (orientation) {
                case ExifInterface.ORIENTATION_ROTATE_90:
                case ExifInterface.ORIENTATION_ROTATE_270:
                    // Width and height are swapped for these orientations.
                    originW = h;
                    originH = w;
                    break;
                case ExifInterface.ORIENTATION_UNDEFINED:
                default:
                    originW = w;
                    originH = h;
                    break;
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        if (0 == originW || 0 == originH) {
            Point size = getBitmapSize(file);
            originW = size.x;
            originH = size.y;
        }
        if (outW <= 0 && outH <= 0) {
            outW = originW;
            outH = originH;
        } else if (outW <= 0) {
            outW = (int) (originW * (outH / (float) originH));
        } else if (outH <= 0) {
            outH = (int) (originH * (outW / (float) originW));
        }
        return new Point(outW, outH);
    }

    public static String bmp2File(Bitmap bmp, String filePath, Bitmap.CompressFormat format, int quality) {
        String fileName = FileUtil.randomFileName(Bitmap.CompressFormat.PNG == format ? ".png" : ".jpg");
        String fullPath = filePath + fileName;
        File file = new File(fullPath);
        try {
            file.createNewFile();
        } catch (IOException e) {
            e.printStackTrace();
            return "";
        }
        FileOutputStream fOut = null;
        try {
            fOut = new FileOutputStream(file);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
            return "";
        }
        bmp.compress(format, quality, fOut);
        try {
            fOut.flush();
            fOut.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return fullPath;
    }

    public static final String insertImage(ContentResolver cr, String imagePath,
            String name, String description) throws FileNotFoundException {
        FileInputStream inStream = new FileInputStream(imagePath);
        return insertImage(cr, inStream, name, description);
    }

    /**
     * A copy of the Android internals insertImage method, this method populates
     * the meta data with DATE_ADDED and DATE_TAKEN. This fixes a common problem
     * where media that is inserted manually gets saved at the end of the
     * gallery (because date is not populated).
     *
     * @see Images.Media#insertImage(ContentResolver, Bitmap, String, String)
     */
    public static final String insertImage(ContentResolver cr, FileInputStream inStream,
            String title, String description) {
        ContentValues values = new ContentValues();
        values.put(Images.Media.TITLE, title);
        values.put(Images.Media.DISPLAY_NAME, title);
        values.put(Images.Media.DESCRIPTION, description);
        values.put(Images.Media.MIME_TYPE, "image/jpeg");
        // Add the date meta data to ensure the image is added at the front of the gallery.
        values.put(Images.Media.DATE_ADDED, System.currentTimeMillis());
        values.put(Images.Media.DATE_TAKEN, System.currentTimeMillis());

        Uri url = null;
        String stringUrl = null; /* value to be returned */
        try {
            url = cr.insert(Images.Media.EXTERNAL_CONTENT_URI, values);
            OutputStream imageOut = cr.openOutputStream(url);
            try {
                byte[] buffer = new byte[1024];
                int count;
                while ((count = inStream.read(buffer)) >= 0) {
                    imageOut.write(buffer, 0, count);
                }
            } finally {
                imageOut.close();
            }

            long id = ContentUris.parseId(url);
            // Wait until MINI_KIND thumbnail is generated.
            Bitmap miniThumb = Images.Thumbnails.getThumbnail(cr, id,
                    Images.Thumbnails.MINI_KIND, null);
            // This is for backward compatibility.
            storeThumbnail(cr, miniThumb, id, 50F, 50F, Images.Thumbnails.MICRO_KIND);
        } catch (Exception e) {
            // If anything failed, remove the half-inserted row.
            if (url != null) {
                cr.delete(url, null, null);
                url = null;
            }
        }
        if (url != null) {
            stringUrl = url.toString();
        }
        return stringUrl;
    }

    /**
     * A copy of the Android internals StoreThumbnail method, it is used with
     * insertImage to populate the
     * android.provider.MediaStore.Images.Media#insertImage row with all the
     * correct meta data. The StoreThumbnail method is private so it must be
     * duplicated here.
     *
     * @see Images.Media (StoreThumbnail is private)
     */
    private static final Bitmap storeThumbnail(ContentResolver cr, Bitmap source,
            long id, float width, float height, int kind) {
        // create the matrix to scale it
        Matrix matrix = new Matrix();
        float scaleX = width / source.getWidth();
        float scaleY = height / source.getHeight();
        matrix.setScale(scaleX, scaleY);

        Bitmap thumb = Bitmap.createBitmap(source, 0, 0,
                source.getWidth(), source.getHeight(), matrix, true);

        ContentValues values = new ContentValues(4);
        values.put(Images.Thumbnails.KIND, kind);
        values.put(Images.Thumbnails.IMAGE_ID, (int) id);
        values.put(Images.Thumbnails.HEIGHT, thumb.getHeight());
        values.put(Images.Thumbnails.WIDTH, thumb.getWidth());

        Uri url = cr.insert(Images.Thumbnails.EXTERNAL_CONTENT_URI, values);
        try {
            OutputStream thumbOut = cr.openOutputStream(url);
            thumb.compress(Bitmap.CompressFormat.JPEG, 100, thumbOut);
            thumbOut.close();
            return thumb;
        } catch (FileNotFoundException ex) {
            return null;
        } catch (IOException ex) {
            return null;
        }
    }

    // /////////////////////////////////////////////////

    public static String checkPicturePath(final Context context, final Uri uri) {
        final boolean isKitKat = Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT;

        // DocumentProvider
        if (isKitKat && DocumentsContract.isDocumentUri(context, uri)) {
            // ExternalStorageProvider
            if (isExternalStorageDocument(uri)) {
                final String docId = DocumentsContract.getDocumentId(uri);
                final String[] split = docId.split(":");
                final String type = split[0];
                if ("primary".equalsIgnoreCase(type)) {
                    return Environment.getExternalStorageDirectory() + "/" + split[1];
                }
            }
            // DownloadsProvider
            else if (isDownloadsDocument(uri)) {
                final String id = DocumentsContract.getDocumentId(uri);
                final Uri contentUri = ContentUris.withAppendedId(
                        Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));
                return getDataColumn(context, contentUri, null, null);
            }
            // MediaProvider
            else if (isMediaDocument(uri)) {
                final String docId = DocumentsContract.getDocumentId(uri);
                final String[] split = docId.split(":");
                final String type = split[0];

                Uri contentUri = null;
                if ("image".equals(type)) {
                    contentUri = Images.Media.EXTERNAL_CONTENT_URI;
                } else if ("video".equals(type)) {
                    contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
                } else if ("audio".equals(type)) {
                    contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
                }

                final String selection = "_id=?";
                final String[] selectionArgs = new String[]{split[1]};
                return getDataColumn(context, contentUri, selection, selectionArgs);
            }
        }
        // MediaStore (and general)
        else if ("content".equalsIgnoreCase(uri.getScheme())) {
            // Return the remote address
            if (isGooglePhotosUri(uri)) return uri.getLastPathSegment();
            return getDataColumn(context, uri, null, null);
        }
        // File
        else if ("file".equalsIgnoreCase(uri.getScheme())) {
            return uri.getPath();
        }
        return null;
    }

    /**
     * Get the value of the data column for this Uri. This is useful for
     * MediaStore Uris, and other file-based ContentProviders.
     *
     * @param context       The context.
     * @param uri           The Uri to query.
     * @param selection     (Optional) Filter used in the query.
     * @param selectionArgs (Optional) Selection arguments used in the query.
     * @return The value of the _data column, which is typically a file path.
     */
    private static String getDataColumn(Context context, Uri uri, String selection, String[] selectionArgs) {
        Cursor cursor = null;
        final String column = "_data";
        final String[] projection = {column};
        try {
            cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs, null);
            if (cursor != null && cursor.moveToFirst()) {
                final int index = cursor.getColumnIndexOrThrow(column);
                return cursor.getString(index);
            }
        } finally {
            if (cursor != null) cursor.close();
        }
        return null;
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is ExternalStorageProvider.
     */
    private static boolean isExternalStorageDocument(Uri uri) {
        return "com.android.externalstorage.documents".equals(uri.getAuthority());
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is DownloadsProvider.
     */
    private static boolean isDownloadsDocument(Uri uri) {
        return "com.android.providers.downloads.documents".equals(uri.getAuthority());
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is MediaProvider.
     */
    private static boolean isMediaDocument(Uri uri) {
        return "com.android.providers.media.documents".equals(uri.getAuthority());
    }

    /**
     * @param uri The Uri to check.
     * @return Whether the Uri authority is Google Photos.
     */
    private static boolean isGooglePhotosUri(Uri uri) {
        return "com.google.android.apps.photos.content".equals(uri.getAuthority());
    }
}
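The down-scale factor chosen in saveBitmap above can be isolated into a small framework-free sketch. The class and method names here are illustrative, not part of the library; the logic mirrors the three branches (no target size, one dimension given, both given) and clamps to 1 so the decoder never tries to up-scale:

```java
// Hypothetical helper mirroring the scaling logic in saveBitmap:
// given the original bitmap size and the requested output size,
// pick an integer inSampleSize-style scale factor.
public class ScaleFactorDemo {

    // Returns the integer factor by which the decoder may shrink the image.
    static int computeScaleFactor(int originW, int originH, int outW, int outH) {
        if (outW <= 0 && outH <= 0) {
            return 1;                            // no target size: decode at full resolution
        } else if (outW <= 0) {
            return Math.max(1, originH / outH);  // only a height constraint
        } else if (outH <= 0) {
            return Math.max(1, originW / outW);  // only a width constraint
        }
        // Both constraints: use the smaller ratio so the result still covers the target.
        return Math.max(1, Math.min(originW / outW, originH / outH));
    }

    public static void main(String[] args) {
        // 4000x3000 source, 1000x1000 target: min(4, 3) = 3
        System.out.println(computeScaleFactor(4000, 3000, 1000, 1000)); // prints 3
    }
}
```

Feeding this factor into BitmapFactory.Options.inSampleSize keeps memory bounded while decoding large photos picked from the album for QR decoding.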
IMResUtil.java
public class IMResUtil {
    private static final String TAG = "IMResUtil";
    private static IMResUtil instance;
    private Context context;
    private static Class id = null;
    private static Class layout = null;
    private static Class style = null;
    private static Class attr = null;
    private static Class styleable = null;

    public IMResUtil(Context paramContext) {
        this.context = paramContext.getApplicationContext();
        try {
            layout = Class.forName(this.context.getPackageName() + ".R$layout");
        } catch (ClassNotFoundException localClassNotFoundException2) {
            Log.e(TAG, "localClassNotFoundException2 = " + localClassNotFoundException2.toString());
        }
        try {
            id = Class.forName(this.context.getPackageName() + ".R$id");
        } catch (ClassNotFoundException localClassNotFoundException3) {
            Log.e(TAG, "localClassNotFoundException3 = " + localClassNotFoundException3.toString());
        }
        try {
            style = Class.forName(this.context.getPackageName() + ".R$style");
        } catch (ClassNotFoundException localClassNotFoundException5) {
            Log.e(TAG, "localClassNotFoundException5 = " + localClassNotFoundException5.toString());
        }
        try {
            attr = Class.forName(this.context.getPackageName() + ".R$attr");
        } catch (ClassNotFoundException localClassNotFoundException10) {
            Log.e(TAG, "localClassNotFoundException10 = " + localClassNotFoundException10.toString());
        }
        try {
            styleable = Class.forName(this.context.getPackageName() + ".R$styleable");
        } catch (ClassNotFoundException localClassNotFoundException10) {
            Log.e(TAG, "localClassNotFoundException10 = " + localClassNotFoundException10.toString());
        }
    }

    public static IMResUtil getResofR(Context paramContext) {
        if (instance == null) {
            instance = new IMResUtil(paramContext);
        }
        return instance;
    }

    public int getId(String paramString) { return getResofR(id, paramString); }

    public int getLayout(String paramString) { return getResofR(layout, paramString); }

    public int getStyle(String paramString) { return getResofR(style, paramString); }

    public int getAttr(String paramString) { return getResofR(attr, paramString); }

    public int getStyleable(String paramString) { return getResofR(styleable, paramString); }

    public int[] getStyleableArray(String paramString) {
        try {
            Class clz = Class.forName(context.getPackageName() + ".R$styleable");
            Field field = clz.getField(paramString);
            Object object = field.get(clz);
            int[] attrs = (int[]) object;
            for (int i = 0; i < attrs.length; i++) {
                Log.d(TAG, "styleable[" + i + "] = " + attrs[i]);
            }
            return attrs;
        } catch (Exception e) {
            Log.d(TAG, e.getMessage());
        }
        return null;
    }

    private int getResofR(Class<?> paramClass, String paramString) {
        if (paramClass == null) {
            Log.d(TAG, "getRes(null," + paramString + ")");
            Log.d(TAG, "Class is null, ResClass is not initialized.");
            // throw new IllegalArgumentException("ResClass is not initialized.");
            return -1;
        }
        try {
            Field localField = paramClass.getField(paramString);
            int k = localField.getInt(paramString);
            return k;
        } catch (Exception localException) {
            Log.e(TAG, "getRes(" + paramClass.getName() + ", " + paramString + ")");
            Log.e(TAG, "Error getting resource. Make sure you have copied all resources (res/) from SDK to your project.");
            Log.e(TAG, localException.getMessage());
        }
        return -1;
    }
}
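The core trick in IMResUtil is plain reflection: look up a static int field on a generated R-style class by name. A minimal, Android-free sketch of that lookup (the R2 class below is a stand-in for the app's real R$id / R$layout classes, and all names here are illustrative):

```java
// Framework-free sketch of IMResUtil's reflective resource lookup.
public class ResLookupDemo {

    // Stand-in for a generated resource class such as com.example.R$id.
    public static class R2 {
        public static final int iv_qr_back = 0x7f080001;
        public static final int tv_qr_title = 0x7f080002;
    }

    // Mirrors the shape of IMResUtil.getResofR(Class, String): returns the
    // field's value, or -1 when the name does not exist, instead of crashing.
    static int getRes(Class<?> resClass, String name) {
        if (resClass == null) return -1;
        try {
            // Static field: the instance argument to getInt() is ignored.
            return resClass.getField(name).getInt(null);
        } catch (NoSuchFieldException | IllegalAccessException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(getRes(R2.class, "iv_qr_back") == 0x7f080001); // prints true
        System.out.println(getRes(R2.class, "no_such_id"));               // prints -1
    }
}
```

This is why the class works inside a library distributed as a jar: it resolves the host app's R classes at runtime by package name instead of compiling against them.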
xml

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <RelativeLayout
        android:id="@+id/rl"
        android:layout_width="fill_parent"
        android:layout_height="50dp"
        android:background="#000000"
        android:paddingLeft="6dp"
        android:paddingStart="6dp">

        <ImageView
            android:id="@+id/iv_qr_back"
            android:layout_width="30dp"
            android:layout_height="24dp"
            android:layout_centerVertical="true"
            android:src="@drawable/arrow_back"/>

        <TextView
            android:id="@+id/tv_qr_title"
            android:layout_width="wrap_content"
            android:layout_height="match_parent"
            android:layout_centerHorizontal="true"
            android:textColor="#ffffff"
            android:textSize="20sp"/>

        <TextView
            android:id="@+id/tv_qr_open_image"
            android:layout_width="wrap_content"
            android:layout_height="match_parent"
            android:layout_alignParentEnd="true"
            android:layout_alignParentRight="true"
            android:layout_marginEnd="15dp"
            android:layout_marginRight="15dp"
            android:textColor="#ffffff"
            android:textSize="20sp"/>
    </RelativeLayout>

    <SurfaceView
        android:id="@+id/device_qrcode_preview_view"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_below="@+id/rl"/>

    <com.yzq.zxinglibrary.view.ViewfinderView
        android:id="@+id/vv_qr_viewfinderView"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center"
        app:device_qrcode_corner_color="#005699"
        app:device_qrcode_frame_color="#90FFFFFF"
        app:device_qrcode_laser_color="#005699"
        app:device_qrcode_mask_color="#60000000"
        app:device_qrcode_result_color="#B0000000"
        app:device_qrcode_result_point_color="#C0FFFF00"/>
    <!--app:label_text="將二維碼放入框中,即可掃描"
        app:label_text_color="#ffffff"
        app:label_text_size="16sp"-->

    <LinearLayout
        android:id="@+id/bottomLayout"
        android:layout_width="match_parent"
        android:layout_height="96dp"
        android:layout_alignParentBottom="true"
        android:layout_gravity="bottom"
        android:background="#99000000"
        android:orientation="horizontal">

        <LinearLayout
            android:id="@+id/flashLightLayout"
            android:layout_width="0dp"
            android:layout_height="match_parent"
            android:layout_weight="1"
            android:orientation="vertical">

            <!--android:background="@drawable/device_qrcode_scan_flash_off"-->
            <ImageView
                android:id="@+id/flashLightIv"
                android:layout_width="36dp"
                android:layout_height="36dp"
                android:scaleType="centerCrop"
                android:tint="#ffffffff"/>

            <TextView
                android:id="@+id/flashLightTv"
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:layout_marginTop="5dp"
                android:textColor="@color/result_text"/>
        </LinearLayout>

        <LinearLayout
            android:id="@+id/albumLayout"
            android:layout_width="0dp"
            android:layout_height="match_parent"
            android:layout_weight="1"
            android:orientation="vertical">

            <ImageView
                android:id="@+id/img_phone"
                android:layout_width="36dp"
                android:layout_height="36dp"
                android:scaleType="centerCrop"
                android:src="@drawable/ic_photo"/>

            <TextView
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:layout_marginTop="5dp"
                android:textColor="@color/result_text"/>
        </LinearLayout>
    </LinearLayout>
</RelativeLayout>
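For the layout to inflate, the custom app:device_qrcode_* attributes on ViewfinderView must be declared as a styleable in the library's res/values. The attribute names below are taken from the layout above; the declare-styleable name and the color formats are assumptions, a sketch of what the entry would look like rather than the library's exact file:

```xml
<!-- res/values/attrs.xml (sketch; styleable name and formats assumed) -->
<resources>
    <declare-styleable name="ViewfinderView">
        <attr name="device_qrcode_corner_color" format="color"/>
        <attr name="device_qrcode_frame_color" format="color"/>
        <attr name="device_qrcode_laser_color" format="color"/>
        <attr name="device_qrcode_mask_color" format="color"/>
        <attr name="device_qrcode_result_color" format="color"/>
        <attr name="device_qrcode_result_point_color" format="color"/>
    </declare-styleable>
</resources>
```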
Manifest file:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.xxx.zxinglibrary">

    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.FLASHLIGHT" />
    <uses-permission android:name="android.permission.VIBRATE" />
    <uses-permission android:name="android.permission.WAKE_LOCK" />
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" />

    <application>
        <activity
            android:name=".decode.CaptureActivity"
            android:configChanges="orientation|screenSize"
            android:screenOrientation="portrait"
            android:theme="@android:style/Theme.NoTitleBar"
            android:windowSoftInputMode="adjustPan|stateHidden" />
    </application>
</manifest>
If this summary helped you, please give it a like. If it didn't, or if you know of something better, please leave a comment, whether it's criticism or anything else: a link, a proposed change, or a better optimization suggestion. We will gladly take it on board. Thanks!