Android Face Detection API Tutorial (source code included)

Android API, latest Chinese edition
http://www.eoeandroid.com/thread-58597-1-1.html
============= Post body =======================

Through two main APIs, android.media.FaceDetector and android.media.FaceDetector.Face, Android provides a way to run face detection directly on a bitmap; both are part of the official Android API. This tutorial, which comes from the Developer site, introduces these APIs and includes the sample code for download.
 
Face detection means locating the position and size of every face in an image or a video frame. It is a key stage in face recognition systems, and it can also be used on its own, for example in video surveillance. With digital media everywhere today, face detection also helps us quickly filter the images that contain faces out of huge photo collections. In current digital cameras, face detection drives autofocus, i.e. "face-priority focus". Face-priority focus has been called the most important photographic innovation in the twenty years since auto-exposure and autofocus: in consumer cameras the vast majority of photos have people as the subject, so the camera's exposure and focus should be based on the people in the frame.

Building a Face Detection Android Activity

You can build a generic Android Activity; here we extend the base class ImageView into MyImageView. The bitmap containing the faces to be detected must be in 565 (RGB_565) format for the API to work. Each detected face comes with a confidence measure, which is compared against android.media.FaceDetector.Face.CONFIDENCE_THRESHOLD.
The most important work happens in setFace(): it instantiates a FaceDetector object and calls findFaces; the results are stored in faces, and the face midpoints are handed to MyImageView. The code is as follows:

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.PointF;
import android.media.FaceDetector;
import android.os.Bundle;
import android.util.Log;
import android.view.ViewGroup.LayoutParams;

public class TutorialOnFaceDetect1 extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 1;
    private static String TAG = "TutorialOnFaceDetect";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo and convert it to the RGB_565 format FaceDetector requires
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);

        // perform face detection and set the feature points
        setFace();

        mIV.invalidate();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        PointF midpoint = new PointF();
        int[] fpx = null;
        int[] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detected any faces
        if (count > 0) {
            fpx = new int[count];
            fpy = new int[count];

            for (int i = 0; i < count; i++) {
                try {
                    faces[i].getMidPoint(midpoint);

                    fpx[i] = (int) midpoint.x;
                    fpy[i] = (int) midpoint.y;
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }

        mIV.setDisplayPoints(fpx, fpy, count, 0);
    }
}


In the next piece of code we add setDisplayPoints() to MyImageView, which renders a marker on each detected face. Figure 1 shows such a marker centered on a detected face.
// set up detected face features for display
public void setDisplayPoints(int[] xx, int[] yy, int total, int style) {
    mDisplayStyle = style;
    mPX = null;
    mPY = null;

    if (xx != null && yy != null && total > 0) {
        mPX = new int[total];
        mPY = new int[total];

        for (int i = 0; i < total; i++) {
            mPX[i] = xx[i];
            mPY[i] = yy[i];
        }
    }
}
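The element-by-element copy above can equally be written with System.arraycopy. A standalone sketch (the class and method names here are ours, not part of the tutorial):

```java
import java.util.Arrays;

public class CopyDemo {
    // copy the first `total` feature coordinates into a fresh array,
    // mirroring what setDisplayPoints() does with its loop
    public static int[] copyPoints(int[] src, int total) {
        int[] dst = new int[total];
        System.arraycopy(src, 0, dst, 0, total);
        return dst;
    }

    public static void main(String[] args) {
        int[] out = copyPoints(new int[]{10, 20, 30}, 3);
        System.out.println(Arrays.toString(out)); // prints "[10, 20, 30]"
    }
}
```

Taking a copy (rather than keeping a reference to the caller's arrays) keeps MyImageView safe if the caller reuses or clears its own arrays later.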


Multiple Face Detection

With FaceDetector you can set an upper limit on the number of faces to detect. For example, to detect at most 10 faces:

private static final int MAX_FACES = 10;

Figure 2 shows multiple faces being detected.
Locating the Eye Centers

Android face detection returns other useful information as well, such as eyesDistance, pose, and confidence. We can use eyesDistance to locate the center of each eye.

In the code below we move the setFace() call into doLengthyCalc(), so detection runs on a background thread. Figure 3 shows the result of locating the eye centers.
import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.PointF;
import android.media.FaceDetector;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.util.Log;
import android.view.ViewGroup.LayoutParams;

public class TutorialOnFaceDetect extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 10;
    private static String TAG = "TutorialOnFaceDetect";
    private static boolean DEBUG = false;

    protected static final int GUIUPDATE_SETFACE = 999;
    protected Handler mHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            mIV.invalidate();

            super.handleMessage(msg);
        }
    };

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo and convert it to the RGB_565 format FaceDetector requires
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);
        mIV.invalidate();

        // perform face detection in setFace() in a background thread
        doLengthyCalc();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        PointF eyescenter = new PointF();
        float eyesdist = 0.0f;
        int[] fpx = null;
        int[] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detected any faces
        if (count > 0) {
            // two feature points (left and right eye) per face
            fpx = new int[count * 2];
            fpy = new int[count * 2];

            for (int i = 0; i < count; i++) {
                try {
                    faces[i].getMidPoint(eyescenter);
                    eyesdist = faces[i].eyesDistance();

                    // set up left eye location
                    fpx[2 * i] = (int) (eyescenter.x - eyesdist / 2);
                    fpy[2 * i] = (int) eyescenter.y;

                    // set up right eye location
                    fpx[2 * i + 1] = (int) (eyescenter.x + eyesdist / 2);
                    fpy[2 * i + 1] = (int) eyescenter.y;

                    if (DEBUG) {
                        Log.e(TAG, "setFace(): face " + i + ": confidence = " + faces[i].confidence()
                                + ", eyes distance = " + faces[i].eyesDistance()
                                + ", pose = (" + faces[i].pose(FaceDetector.Face.EULER_X) + ","
                                + faces[i].pose(FaceDetector.Face.EULER_Y) + ","
                                + faces[i].pose(FaceDetector.Face.EULER_Z) + ")"
                                + ", eyes midpoint = (" + eyescenter.x + "," + eyescenter.y + ")");
                    }
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }

        mIV.setDisplayPoints(fpx, fpy, count * 2, 1);
    }

    private void doLengthyCalc() {
        Thread t = new Thread() {
            Message m = new Message();

            public void run() {
                try {
                    setFace();
                    m.what = TutorialOnFaceDetect.GUIUPDATE_SETFACE;
                    TutorialOnFaceDetect.this.mHandler.sendMessage(m);
                } catch (Exception e) {
                    Log.e(TAG, "doLengthyCalc(): " + e.toString());
                }
            }
        };

        t.start();
    }
}
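The per-eye coordinates in setFace() are plain arithmetic on the eyes midpoint and eyesDistance, so they can be checked outside Android. A minimal sketch (EyeMath and eyeCenters are our own names, not part of the tutorial or the Android API):

```java
import java.util.Arrays;

public class EyeMath {
    // returns {leftX, leftY, rightX, rightY}: each eye sits half the
    // inter-eye distance to either side of the midpoint, at the same height
    public static int[] eyeCenters(float midX, float midY, float eyesDist) {
        int leftX  = (int) (midX - eyesDist / 2);
        int rightX = (int) (midX + eyesDist / 2);
        int y      = (int) midY;
        return new int[]{leftX, y, rightX, y};
    }

    public static void main(String[] args) {
        // midpoint (100, 80), eyes 40 px apart
        System.out.println(Arrays.toString(eyeCenters(100f, 80f, 40f)));
        // prints "[80, 80, 120, 80]"
    }
}
```

Note this places both eyes at the midpoint's y coordinate; for tilted faces you would need the pose angles as well.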