一,、前言上一篇介紹了 自定義Camera系列之:SurfaceView + Camera,,接著我們介紹使用 為什么選擇 由于
缺點(diǎn)是必須在硬件加速的窗口中使用,且內(nèi)存和耗電比 下面是應(yīng)用的簡(jiǎn)要截圖: 二、相機(jī)開(kāi)發(fā)步驟我們選擇將 Camera 和 View 分開(kāi),,Camera 的相關(guān)操作由 1. 打開(kāi)相機(jī)打開(kāi)相機(jī)需要傳入一個(gè) public void openCamera() { mCamera = Camera.open(mCameraId); // 打開(kāi)相機(jī) Camera.getCameraInfo(mCameraId, mCameraInfo); // 獲取相機(jī)信息 initConfig(); // 初始化相機(jī)配置 setDisplayOrientation(); // 設(shè)置相機(jī)顯示方向 } 2. 初始化相機(jī)配置在舊 Camera API 中,,相機(jī)的配置都是通過(guò) 我們可以設(shè)置 閃光模式,、聚焦模式,、曝光強(qiáng)度、預(yù)覽圖片格式和大小,、拍照?qǐng)D片格式和大小等等信息,。 private void initConfig() { try { mParameters = mCamera.getParameters(); // 如果攝像頭不支持這些參數(shù)都會(huì)出錯(cuò)的,所以設(shè)置的時(shí)候一定要判斷是否支持 List<String> supportedFlashModes = mParameters.getSupportedFlashModes(); if (supportedFlashModes != null && supportedFlashModes.contains(Parameters.FLASH_MODE_OFF)) { mParameters.setFlashMode(Parameters.FLASH_MODE_OFF); // 設(shè)置閃光模式(關(guān)閉) } List<String> supportedFocusModes = mParameters.getSupportedFocusModes(); if (supportedFocusModes != null && supportedFocusModes.contains(Parameters.FOCUS_MODE_AUTO)) { mParameters.setFocusMode(Parameters.FOCUS_MODE_AUTO); // 設(shè)置聚焦模式(自動(dòng)) } mParameters.setPreviewFormat(ImageFormat.NV21); // 設(shè)置預(yù)覽圖片格式 mParameters.setPictureFormat(ImageFormat.JPEG); // 設(shè)置拍照?qǐng)D片格式 mParameters.setExposureCompensation(0); // 設(shè)置曝光強(qiáng)度 Size previewSize = getSuitableSize(mParameters.getSupportedPreviewSizes()); mPreviewWidth = previewSize.width; mPreviewHeight = previewSize.height; mParameters.setPreviewSize(mPreviewWidth, mPreviewHeight); // 設(shè)置預(yù)覽圖片大小 Log.d(TAG, "previewWidth: " + mPreviewWidth + ", previewHeight: " + mPreviewHeight); Size pictureSize = getSuitableSize(mParameters.getSupportedPictureSizes()); mParameters.setPictureSize(pictureSize.width, pictureSize.height); Log.d(TAG, "pictureWidth: " + pictureSize.width + ", pictureHeight: " + pictureSize.height); mCamera.setParameters(mParameters); // 將設(shè)置好的parameters添加到相機(jī)里 } catch (Exception e) { e.printStackTrace(); } } 這里用到的一個(gè) 3. 設(shè)置相機(jī)預(yù)覽時(shí)的顯示方向這個(gè)方法很重要,,關(guān)乎著你預(yù)覽畫(huà)面是否正常。其實(shí)相機(jī)底層的預(yù)覽畫(huà)面全都是寬度大于高度的,,但是豎屏?xí)r畫(huà)面要顯示正常,,都是通過(guò)這個(gè)方法設(shè)置了一定的顯示方向。 private void setDisplayOrientation() { int rotation = mActivity.getWindowManager().getDefaultDisplay().getRotation(); int degrees = 0; switch (rotation) { case Surface.ROTATION_0: degrees = 0; break; case Surface.ROTATION_90: degrees = 90; break; case Surface.ROTATION_180: degrees = 180; break; case Surface.ROTATION_270: degrees = 270; break; } int result; if (mCameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) { result = (mCameraInfo.orientation + degrees) % 360; result = (360 - result) % 360; // compensate the mirror } else { // back-facing result = (mCameraInfo.orientation - degrees + 360) % 360; } mCamera.setDisplayOrientation(result); } 4. 開(kāi)始預(yù)覽,、停止預(yù)覽可以通過(guò) public void startPreview(SurfaceTexture surface) { if (mCamera != null) { try { mCamera.setPreviewTexture(surface); // 先綁定顯示的畫(huà)面 } catch (IOException e) { e.printStackTrace(); } mCamera.startPreview(); // 這里才是開(kāi)始預(yù)覽 } } public void stopPreview() { if (mCamera != null) { mCamera.stopPreview(); // 停止預(yù)覽 } } 5. 釋放相機(jī)相機(jī)是很耗費(fèi)系統(tǒng)資源的東西,,用完一定要釋放,。對(duì)應(yīng)于 public void releaseCamera() { if (mCamera != null) { mCamera.setPreviewCallback(null); mCamera.stopPreview(); mCamera.release(); mCamera = null; } } 6. 點(diǎn)擊聚焦簡(jiǎn)單的說(shuō),,就是根據(jù)用戶在 view 上的觸摸點(diǎn),,使相機(jī)對(duì)該點(diǎn)進(jìn)行一次對(duì)焦操作。 詳細(xì)看我的這篇博客介紹的把:Android自定義相機(jī)定點(diǎn)聚焦 完整代碼后面會(huì)貼的,。 7. 雙指放大縮小我們也只實(shí)現(xiàn)放大縮小的邏輯,,至于 View 的觸摸交給 View 類去完成。 public void handleZoom(boolean isZoomIn) { if (mParameters.isZoomSupported()) { // 首先還是要判斷是否支持 int maxZoom = mParameters.getMaxZoom(); int zoom = mParameters.getZoom(); if (isZoomIn && zoom < maxZoom) { zoom++; } else if (zoom > 0) { zoom--; } mParameters.setZoom(zoom); // 通過(guò)這個(gè)方法設(shè)置放大縮小 mCamera.setParameters(mParameters); } else { Log.w(TAG, "zoom not supported"); } } 8. 拍照拍照的邏輯我交給上層去完成了,,后面再詳細(xì)介紹把,。這里我們只是簡(jiǎn)單的封裝了一下元接口,一般常用的是 public void takePicture(Camera.PictureCallback pictureCallback) { mCamera.takePicture(null, null, pictureCallback); } 9. 其它諸如設(shè)置預(yù)覽回調(diào)、切換前后攝像頭的操作等直接看下面的實(shí)現(xiàn)把,。另外對(duì)于 聚焦模式,、閃光燈模式等沒(méi)有詳細(xì)去介紹了,,感興趣的可以另外搜索相關(guān)模塊。畢竟相機(jī)要介紹完全的話還是一塊很大的東西,。 10. CameraProxy 類下面代碼還用到了 package com.afei.camerademo.camera; import android.app.Activity; import android.graphics.ImageFormat; import android.graphics.Rect; import android.graphics.SurfaceTexture; import android.hardware.Camera; import android.hardware.Camera.CameraInfo; import android.hardware.Camera.Parameters; import android.hardware.Camera.PreviewCallback; import android.hardware.Camera.Size; import android.util.Log; import android.view.OrientationEventListener; import android.view.Surface; import android.view.SurfaceHolder; import java.io.IOException; import java.util.ArrayList; import java.util.List; @SuppressWarnings("deprecation") public class CameraProxy implements Camera.AutoFocusCallback { private static final String TAG = "CameraProxy"; private Activity mActivity; private Camera mCamera; private Parameters mParameters; private CameraInfo mCameraInfo = new CameraInfo(); private int mCameraId = CameraInfo.CAMERA_FACING_BACK; private int mPreviewWidth = 1440; // default 1440 private int mPreviewHeight = 1080; // default 1080 private float mPreviewScale = mPreviewHeight * 1f / mPreviewWidth; private PreviewCallback mPreviewCallback; // 相機(jī)預(yù)覽的數(shù)據(jù)回調(diào) private OrientationEventListener mOrientationEventListener; private int mLatestRotation = 0; public byte[] mPreviewBuffer; public CameraProxy(Activity activity) { mActivity = activity; mOrientationEventListener = new OrientationEventListener(mActivity) { @Override public void onOrientationChanged(int orientation) { Log.d(TAG, "onOrientationChanged: orientation: " + orientation); setPictureRotate(orientation); } }; } public void openCamera() { Log.d(TAG, "openCamera cameraId: " + mCameraId); mCamera = Camera.open(mCameraId); Camera.getCameraInfo(mCameraId, mCameraInfo); initConfig(); setDisplayOrientation(); Log.d(TAG, "openCamera enable mOrientationEventListener"); mOrientationEventListener.enable(); } public void releaseCamera() { if (mCamera != null) { Log.v(TAG, "releaseCamera"); mCamera.setPreviewCallback(null); mCamera.stopPreview(); mCamera.release(); mCamera = null; } mOrientationEventListener.disable(); } public void startPreview(SurfaceHolder holder) { if (mCamera != null) { Log.v(TAG, "startPreview"); try { mCamera.setPreviewDisplay(holder); } catch (IOException e) { e.printStackTrace(); } mCamera.startPreview(); } } public void startPreview(SurfaceTexture surface) { if (mCamera != null) { Log.v(TAG, "startPreview"); try { mCamera.setPreviewTexture(surface); } catch (IOException e) { e.printStackTrace(); } mCamera.startPreview(); } } public void stopPreview() { if (mCamera != null) { Log.v(TAG, "stopPreview"); mCamera.stopPreview(); } } public boolean isFrontCamera() { return mCameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT; } private void initConfig() { Log.v(TAG, "initConfig"); try { mParameters = mCamera.getParameters(); // 如果攝像頭不支持這些參數(shù)都會(huì)出錯(cuò)的,,所以設(shè)置的時(shí)候一定要判斷是否支持 List<String> supportedFlashModes = mParameters.getSupportedFlashModes(); if (supportedFlashModes != null && supportedFlashModes.contains(Parameters.FLASH_MODE_OFF)) { mParameters.setFlashMode(Parameters.FLASH_MODE_OFF); // 設(shè)置閃光模式 } List<String> supportedFocusModes = mParameters.getSupportedFocusModes(); if (supportedFocusModes != null && supportedFocusModes.contains(Parameters.FOCUS_MODE_AUTO)) { mParameters.setFocusMode(Parameters.FOCUS_MODE_AUTO); // 設(shè)置聚焦模式 } mParameters.setPreviewFormat(ImageFormat.NV21); // 設(shè)置預(yù)覽圖片格式 mParameters.setPictureFormat(ImageFormat.JPEG); // 設(shè)置拍照?qǐng)D片格式 mParameters.setExposureCompensation(0); // 設(shè)置曝光強(qiáng)度 Size previewSize = getSuitableSize(mParameters.getSupportedPreviewSizes()); mPreviewWidth = previewSize.width; mPreviewHeight = previewSize.height; mParameters.setPreviewSize(mPreviewWidth, mPreviewHeight); // 設(shè)置預(yù)覽圖片大小 Log.d(TAG, "previewWidth: " + mPreviewWidth + ", previewHeight: " + mPreviewHeight); Size pictureSize = getSuitableSize(mParameters.getSupportedPictureSizes()); mParameters.setPictureSize(pictureSize.width, pictureSize.height); Log.d(TAG, "pictureWidth: " + pictureSize.width + ", pictureHeight: " + pictureSize.height); mCamera.setParameters(mParameters); // 將設(shè)置好的parameters添加到相機(jī)里 } catch (Exception e) { e.printStackTrace(); } } private Size getSuitableSize(List<Size> sizes) { int minDelta = Integer.MAX_VALUE; // 最小的差值,,初始值應(yīng)該設(shè)置大點(diǎn)保證之后的計(jì)算中會(huì)被重置 int index = 0; // 最小的差值對(duì)應(yīng)的索引坐標(biāo) for (int i = 0; i < sizes.size(); i++) { Size previewSize = sizes.get(i); Log.v(TAG, "SupportedPreviewSize, width: " + previewSize.width + ", height: " + previewSize.height); // 找到一個(gè)與設(shè)置的分辨率差值最小的相機(jī)支持的分辨率大小 if (previewSize.width * mPreviewScale == previewSize.height) { int delta = Math.abs(mPreviewWidth - previewSize.width); if (delta == 0) { return previewSize; } if (minDelta > delta) { minDelta = delta; index = i; } } } return sizes.get(index); // 默認(rèn)返回與設(shè)置的分辨率最接近的預(yù)覽尺寸 } /** * 設(shè)置相機(jī)顯示的方向,必須設(shè)置,,否則顯示的圖像方向會(huì)錯(cuò)誤 */ private void setDisplayOrientation() { int rotation = mActivity.getWindowManager().getDefaultDisplay().getRotation(); int degrees = 0; switch (rotation) { case Surface.ROTATION_0: degrees = 0; break; case Surface.ROTATION_90: degrees = 90; break; case Surface.ROTATION_180: degrees = 180; break; case Surface.ROTATION_270: degrees = 270; break; } int result; if (mCameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) { result = (mCameraInfo.orientation + degrees) % 360; result = (360 - result) % 360; // compensate the mirror } else { // back-facing result = (mCameraInfo.orientation - degrees + 360) % 360; } mCamera.setDisplayOrientation(result); } private void setPictureRotate(int orientation) { if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) return; orientation = (orientation + 45) / 90 * 90; int rotation = 0; if (mCameraInfo.facing == CameraInfo.CAMERA_FACING_FRONT) { rotation = (mCameraInfo.orientation - orientation + 360) % 360; } else { // back-facing camera rotation = (mCameraInfo.orientation + orientation) % 360; } Log.d(TAG, "picture rotation: " + rotation); mLatestRotation = rotation; } public int getLatestRotation() { return mLatestRotation; } public void setPreviewCallback(PreviewCallback previewCallback) { mPreviewCallback = previewCallback; if (mPreviewBuffer == null) { mPreviewBuffer = new byte[mPreviewWidth * mPreviewHeight * 3 / 2]; } mCamera.addCallbackBuffer(mPreviewBuffer); mCamera.setPreviewCallbackWithBuffer(mPreviewCallback); // 設(shè)置預(yù)覽的回調(diào) } public void takePicture(Camera.PictureCallback pictureCallback) { mCamera.takePicture(null, null, pictureCallback); } public void switchCamera() { mCameraId ^= 1; // 先改變攝像頭朝向 releaseCamera(); openCamera(); } public void focusOnPoint(int x, int y, int width, int height) { Log.v(TAG, "touch point (" + x + ", " + y + ")"); if (mCamera == null) { return; } Parameters parameters = mCamera.getParameters(); // 1.先要判斷是否支持設(shè)置聚焦區(qū)域 if (parameters.getMaxNumFocusAreas() > 0) { // 2.以觸摸點(diǎn)為中心點(diǎn),,view窄邊的1/4為聚焦區(qū)域的默認(rèn)邊長(zhǎng) int length = Math.min(width, height) >> 3; // 1/8的長(zhǎng)度 int left = x - length; int top = y - length; int right = x + length; int bottom = y + length; // 3.映射,因?yàn)橄鄼C(jī)聚焦的區(qū)域是一個(gè)(-1000,-1000)到(1000,1000)的坐標(biāo)區(qū)域 left = left * 2000 / width - 1000; top = top * 2000 / height - 1000; right = right * 2000 / width - 1000; bottom = bottom * 2000 / height - 1000; // 4.判斷上述矩形區(qū)域是否超過(guò)邊界,,若超過(guò)則設(shè)置為臨界值 left = left < -1000 ? -1000 : left; top = top < -1000 ? -1000 : top; right = right > 1000 ? 1000 : right; bottom = bottom > 1000 ? 1000 : bottom; Log.d(TAG, "focus area (" + left + ", " + top + ", " + right + ", " + bottom + ")"); ArrayList<Camera.Area> areas = new ArrayList<>(); areas.add(new Camera.Area(new Rect(left, top, right, bottom), 600)); parameters.setFocusAreas(areas); } try { mCamera.cancelAutoFocus(); // 先要取消掉進(jìn)程中所有的聚焦功能 mCamera.setParameters(parameters); mCamera.autoFocus(this); // 調(diào)用聚焦 } catch (Exception e) { e.printStackTrace(); } } public void handleZoom(boolean isZoomIn) { if (mParameters.isZoomSupported()) { int maxZoom = mParameters.getMaxZoom(); int zoom = mParameters.getZoom(); if (isZoomIn && zoom < maxZoom) { zoom++; } else if (zoom > 0) { zoom--; } Log.d(TAG, "handleZoom: zoom: " + zoom); mParameters.setZoom(zoom); mCamera.setParameters(mParameters); } else { Log.i(TAG, "zoom not supported"); } } public Camera getCamera() { return mCamera; } public int getPreviewWidth() { return mPreviewWidth; } public int getPreviewHeight() { return mPreviewHeight; } @Override public void onAutoFocus(boolean success, Camera camera) { Log.d(TAG, "onAutoFocus: " + success); } } 三,、CameraTextureView通過(guò)上面的介紹,對(duì)于相機(jī)的操作應(yīng)該有了一定的了解了,,接下來(lái)完成 View 這部分。 需求分析:
實(shí)現(xiàn):主要是在 package com.afei.camerademo.textureview; import android.app.Activity; import android.content.Context; import android.graphics.SurfaceTexture; import android.util.AttributeSet; import android.view.MotionEvent; import android.view.TextureView; import com.afei.camerademo.camera.CameraProxy; public class CameraTextureView extends TextureView { private static final String TAG = "CameraTextureView"; private CameraProxy mCameraProxy; private int mRatioWidth = 0; private int mRatioHeight = 0; private float mOldDistance; public CameraTextureView(Context context) { this(context, null); } public CameraTextureView(Context context, AttributeSet attrs) { this(context, attrs, 0); } public CameraTextureView(Context context, AttributeSet attrs, int defStyleAttr) { this(context, attrs, defStyleAttr, 0); } public CameraTextureView(Context context, AttributeSet attrs, int defStyleAttr, int defStyleRes) { super(context, attrs, defStyleAttr, defStyleRes); init(context); } private void init(Context context) { setSurfaceTextureListener(mSurfaceTextureListener); mCameraProxy = new CameraProxy((Activity) context); } private SurfaceTextureListener mSurfaceTextureListener = new SurfaceTextureListener() { @Override public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) { mCameraProxy.openCamera(); int previewWidth = mCameraProxy.getPreviewWidth(); int previewHeight = mCameraProxy.getPreviewHeight(); if (width > height) { setAspectRatio(previewWidth, previewHeight); } else { setAspectRatio(previewHeight, previewWidth); } mCameraProxy.startPreview(surface); } @Override public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { } @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) { mCameraProxy.releaseCamera(); return false; } @Override public void onSurfaceTextureUpdated(SurfaceTexture surface) { } }; public void setAspectRatio(int width, int height) { if (width < 0 || height < 0) { throw new IllegalArgumentException("Size cannot be negative."); } mRatioWidth = width; mRatioHeight = height; requestLayout(); } public CameraProxy getCameraProxy() { return mCameraProxy; } @Override protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) { super.onMeasure(widthMeasureSpec, heightMeasureSpec); int width = MeasureSpec.getSize(widthMeasureSpec); int height = MeasureSpec.getSize(heightMeasureSpec); if (0 == mRatioWidth || 0 == mRatioHeight) { setMeasuredDimension(width, height); } else { if (width < height * mRatioWidth / mRatioHeight) { setMeasuredDimension(width, width * mRatioHeight / mRatioWidth); } else { setMeasuredDimension(height * mRatioWidth / mRatioHeight, height); } } } @Override public boolean onTouchEvent(MotionEvent event) { if (event.getPointerCount() == 1) { // 點(diǎn)擊聚焦 mCameraProxy.focusOnPoint((int) event.getX(), (int) event.getY(), getWidth(), getHeight()); return true; } switch (event.getAction() & MotionEvent.ACTION_MASK) { case MotionEvent.ACTION_POINTER_DOWN: mOldDistance = getFingerSpacing(event); break; case MotionEvent.ACTION_MOVE: float newDistance = getFingerSpacing(event); if (newDistance > mOldDistance) { mCameraProxy.handleZoom(true); } else if (newDistance < mOldDistance) { mCameraProxy.handleZoom(false); } mOldDistance = newDistance; break; default: break; } return super.onTouchEvent(event); } private static float getFingerSpacing(MotionEvent event) { float x = event.getX(0) - event.getX(1); float y = event.getY(0) - event.getY(1); return (float) Math.sqrt(x * x + y * y); } } 四、CameraActivity接下來(lái),,我們把寫(xiě)好的 注意相機(jī)使用前,需要申請(qǐng)相關(guān)權(quán)限,,以及權(quán)限的動(dòng)態(tài)申請(qǐng),。 1. AndroidManifest.xml相機(jī)相關(guān)權(quán)限如下,,動(dòng)態(tài)權(quán)限的申請(qǐng)代碼很多,這里不詳細(xì)介紹了,,不清楚的可以看這篇博客:Android動(dòng)態(tài)權(quán)限申請(qǐng) <uses-permission android:name="android.permission.CAMERA"/> <uses-feature android:name="android.hardware.camera"/> <uses-feature android:name="android.hardware.camera.autofocus"/> 2. 拍照功能之前只是預(yù)留了拍照的功能,,并等待外面出傳入一個(gè)接口回調(diào),這里我們?cè)谂恼真I點(diǎn)下的時(shí)候執(zhí)行拍照操作就行,。 當(dāng)拍照完成時(shí),,會(huì)自動(dòng)調(diào)用 TextureCameraActivity 完整代碼: package com.afei.camerademo.textureview; import android.content.Intent; import android.graphics.Bitmap; import android.graphics.BitmapFactory; import android.hardware.Camera; import android.os.AsyncTask; import android.os.Bundle; import android.provider.MediaStore; import android.support.v7.app.AppCompatActivity; import android.util.Log; import android.view.View; import android.widget.ImageView; import com.afei.camerademo.ImageUtils; import com.afei.camerademo.R; import com.afei.camerademo.camera.CameraProxy; public class TextureCameraActivity extends AppCompatActivity implements View.OnClickListener { private static final String TAG = "TextureCameraActivity"; private ImageView mCloseIv; private ImageView mSwitchCameraIv; private ImageView mTakePictureIv; private ImageView mPictureIv; private CameraTextureView mCameraView; private CameraProxy mCameraProxy; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_texture_camera); initView(); } private void initView() { mCloseIv = findViewById(R.id.toolbar_close_iv); mCloseIv.setOnClickListener(this); mSwitchCameraIv = findViewById(R.id.toolbar_switch_iv); mSwitchCameraIv.setOnClickListener(this); mTakePictureIv = findViewById(R.id.take_picture_iv); mTakePictureIv.setOnClickListener(this); mPictureIv = findViewById(R.id.picture_iv); mPictureIv.setOnClickListener(this); mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap()); mCameraView = findViewById(R.id.camera_view); mCameraProxy = mCameraView.getCameraProxy(); } @Override public void onClick(View v) { switch (v.getId()) { case R.id.toolbar_close_iv: finish(); break; case R.id.toolbar_switch_iv: mCameraProxy.switchCamera(); mCameraProxy.startPreview(mCameraView.getSurfaceTexture()); break; case R.id.take_picture_iv: mCameraProxy.takePicture(mPictureCallback); break; case R.id.picture_iv: Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI); startActivity(intent); break; } } private final Camera.PictureCallback mPictureCallback = new Camera.PictureCallback() { @Override public void onPictureTaken(byte[] data, Camera camera) { mCameraProxy.startPreview(mCameraView.getSurfaceTexture()); // 拍照結(jié)束后繼續(xù)預(yù)覽 new ImageSaveTask().execute(data); // 保存圖片 } }; private class ImageSaveTask extends AsyncTask<byte[], Void, Void> { @Override protected Void doInBackground(byte[]... bytes) { long time = System.currentTimeMillis(); Bitmap bitmap = BitmapFactory.decodeByteArray(bytes[0], 0, bytes[0].length); Log.d(TAG, "BitmapFactory.decodeByteArray time: " + (System.currentTimeMillis() - time)); int rotation = mCameraProxy.getLatestRotation(); time = System.currentTimeMillis(); Bitmap rotateBitmap = ImageUtils.rotateBitmap(bitmap, rotation, mCameraProxy.isFrontCamera(), true); Log.d(TAG, "rotateBitmap time: " + (System.currentTimeMillis() - time)); time = System.currentTimeMillis(); ImageUtils.saveBitmap(rotateBitmap); Log.d(TAG, "saveBitmap time: " + (System.currentTimeMillis() - time)); return null; } @Override protected void onPostExecute(Void aVoid) { mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap()); } } } 另外,,我又嘗試過(guò)通過(guò) 最后還是選擇使用手動(dòng)旋轉(zhuǎn) Bitmap 的方式來(lái)完成圖片的旋轉(zhuǎn)并保存,。 附上 package com.afei.camerademo; import android.content.ContentResolver; import android.content.ContentValues; import android.content.Context; import android.database.Cursor; import android.graphics.Bitmap; import android.graphics.Matrix; import android.os.Environment; import android.provider.MediaStore; import android.util.Log; import java.io.File; import java.io.FileOutputStream; import java.io.IOException; import java.text.SimpleDateFormat; import java.util.Date; public class ImageUtils { private static final String TAG = "ImageUtils"; private static final String GALLERY_PATH = Environment.getExternalStoragePublicDirectory(Environment .DIRECTORY_DCIM) + File.separator + "Camera"; private static final SimpleDateFormat DATE_FORMAT = new SimpleDateFormat("yyyyMMdd_HHmmss"); public static Bitmap rotateBitmap(Bitmap source, int degree, boolean flipHorizontal, boolean recycle) { if (degree == 0) { return source; } Matrix matrix = new Matrix(); matrix.postRotate(degree); if (flipHorizontal) { matrix.postScale(-1, 1); // 前置攝像頭存在水平鏡像的問(wèn)題,所以有需要的話調(diào)用這個(gè)方法進(jìn)行水平鏡像 } Bitmap rotateBitmap = Bitmap.createBitmap(source, 0, 0, source.getWidth(), source.getHeight(), matrix, false); if (recycle) { source.recycle(); } return rotateBitmap; } public static void saveBitmap(Bitmap bitmap) { String fileName = DATE_FORMAT.format(new Date(System.currentTimeMillis())) + ".jpg"; File outFile = new File(GALLERY_PATH, fileName); Log.d(TAG, "saveImage. filepath: " + outFile.getAbsolutePath()); FileOutputStream os = null; try { os = new FileOutputStream(outFile); boolean success = bitmap.compress(Bitmap.CompressFormat.JPEG, 100, os); if (success) { insertToDB(outFile.getAbsolutePath()); } } catch (IOException e) { e.printStackTrace(); } finally { if (os != null) { try { os.close(); } catch (IOException e) { e.printStackTrace(); } } } } public static void insertToDB(String picturePath) { ContentValues values = new ContentValues(); ContentResolver resolver = MyApp.getInstance().getContentResolver(); values.put(MediaStore.Images.ImageColumns.DATA, picturePath); values.put(MediaStore.Images.ImageColumns.TITLE, picturePath.substring(picturePath.lastIndexOf("/") + 1)); values.put(MediaStore.Images.ImageColumns.DATE_TAKEN, System.currentTimeMillis()); values.put(MediaStore.Images.ImageColumns.MIME_TYPE, "image/jpeg"); resolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values); } } 五,、項(xiàng)目地址部分沒(méi)有貼出來(lái)的代碼,,可在下面地址中找到。 地址: https://github.com/afei-cn/CameraDemo/tree/master/app/src/main/java/com/afei/camerademo/textureview 其它: 自定義Camera系列之:SurfaceView + Camera 自定義Camera系列之:GLSurfaceViewView + Camera 自定義Camera系列之:SurfaceView + Camera2 自定義Camera系列之:TextureView + Camera2 自定義Camera系列之:GLSurfaceView + Camera2 |
|
來(lái)自: 創(chuàng)始元靈6666 > 《work》