Reading an image from HBase and detecting faces in it with OpenCV



I need to read images from HBase and convert them to OpenCV Mat objects for face detection.
My code is as follows:

public static class FaceCountMapper extends TableMapper&lt;Text, Text&gt; {
    private CascadeClassifier faceDetector;

    public void setup(Context context) throws IOException, InterruptedException {
        if (context.getCacheFiles() != null && context.getCacheFiles().length > 0) {
            URI mappingFileUri = context.getCacheFiles()[0];
            if (mappingFileUri != null) {
                System.out.println(mappingFileUri);
                faceDetector = new CascadeClassifier(mappingFileUri.toString());
            }
        }
        super.setup(context);
    } // setup()

    public ArrayList<Object> detectFaces(Mat image, String file_name) {
        ArrayList<Object> facemap = new ArrayList<Object>();
        MatOfRect faceDetections = new MatOfRect();
        faceDetector.detectMultiScale(image, faceDetections);
        System.out.println(String.format("Detected %s faces", faceDetections.toArray().length));
        facemap.add(faceDetections.toArray().length); // record the face count
        return facemap;
    }

    public void map(ImmutableBytesWritable row, Result result, Context context)
            throws InterruptedException, IOException {
        String file_name = Bytes.toString(result.getValue(Bytes.toBytes("Filename"), Bytes.toBytes("data")));
        String mimetype = Bytes.toString(result.getValue(Bytes.toBytes("mime"), Bytes.toBytes("data")));
        byte[] image_data = result.getValue(Bytes.toBytes("Data"), Bytes.toBytes("data"));
        // Decode the encoded image bytes directly into a Mat. Copying raw
        // JPEG/PNG bytes into a CV_8UC3 Mat with mat.put() does not decode them.
        Mat mat = Imgcodecs.imdecode(new MatOfByte(image_data), Imgcodecs.IMREAD_COLOR);
        detectFaces(mat, file_name);
    }
}

The job configuration is as follows:

Configuration conf = this.getConf();
conf.set("hbase.master", "101.192.0.122:16000");
conf.set("hbase.zookeeper.quorum", "101.192.0.122");
conf.setInt("hbase.zookeeper.property.clientPort", 2181);
conf.set("zookeeper.znode.parent", "/hbase-unsecure");
// Initialize and configure MapReduce job
Job job = Job.getInstance(conf);
job.setJarByClass(FaceCount3.class);
job.setMapperClass(FaceCountMapper.class);
job.getConfiguration().set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
job.getConfiguration().set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
Scan scan = new Scan();
scan.setCaching(500); // 1 is the default in Scan, which will be bad for
// MapReduce jobs
scan.setCacheBlocks(false); // don't set to true for MR jobs
TableMapReduceUtil.initTableMapperJob("Image", // input HBase table name
scan, // Scan instance to control CF and attribute selection
FaceCountMapper.class, // mapper
null, // mapper output key
null, // mapper output value
job);
job.setOutputFormatClass(NullOutputFormat.class); // because we aren't
// emitting anything
// from mapper
job.addCacheFile(new URI("/user/hduser/haarcascade_frontalface_alt.xml"));
job.addFileToClassPath(new Path("/user/hduser/hipi-2.1.0.jar"));
job.addFileToClassPath(new Path("/user/hduser/javacpp.jar"));
DistributedCache.addFileToClassPath(new Path("/user/hduser/haarcascade_frontalface_alt.xml"), conf);
conf.set("mapred.job.tracker", "local");
// Execute the MapReduce job and block until it completes
boolean success = job.waitForCompletion(true);
// Return success or failure
return success ? 0 : 1;

When I run it, I get this error:

java.lang.Exception: java.lang.UnsatisfiedLinkError: org.opencv.objdetect.CascadeClassifier.CascadeClassifier_1(Ljava/lang/String;)J

But opencv.jar is provided on the HADOOP_CLASSPATH.

An UnsatisfiedLinkError is thrown when an application tries to load a native library (a .so on Linux, a .dll on Windows, a .dylib on macOS) and that library cannot be found. Specifically, to locate the required native library, the JVM consults both the PATH environment variable and the java.library.path system property.

The JVM also throws UnsatisfiedLinkError if the application loads a library and then attempts to load it again. So first verify that the native library is present on the application's java.library.path or PATH. If the library still cannot be found, try System.load with the library's absolute path (System.loadLibrary takes a bare library name, not a path).
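With the OpenCV Java bindings specifically, the usual pattern is to load the native library once, before constructing any OpenCV object, e.g. `static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }` in the mapper class; skipping this step produces exactly the kind of CascadeClassifier link error shown above. The failure mode itself can be reproduced with plain Java (the library name below is deliberately made up, so the load is guaranteed to fail):

```java
public class NativeLoadCheck {
    /** Try to load a native library by name; return null on success, or the JVM's error message. */
    static String tryLoad(String libName) {
        try {
            System.loadLibrary(libName); // searches java.library.path only
            return null;
        } catch (UnsatisfiedLinkError e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // A name that is certainly not on java.library.path:
        String err = tryLoad("no_such_native_lib_12345");
        System.out.println(err == null ? "loaded" : "UnsatisfiedLinkError raised");
    }
}
```

Note that UnsatisfiedLinkError is an Error, not an Exception, so a plain `catch (Exception e)` will not intercept it.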

In your case, try calling the method below from your code and see what the classpath elements actually are:

/**
 * Method printClassPathResources.
 */
public static void printClassPathResources() {
    final ClassLoader cl = ClassLoader.getSystemClassLoader();
    final URL[] urls = ((URLClassLoader) cl).getURLs();
    LOG.info("Print all classpath resources under the currently running class");
    for (final URL url : urls) {
        LOG.info(url.getFile());
    }
}
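One caveat (this depends on your JVM version, which the question does not state): the cast to URLClassLoader only works up to Java 8. On Java 9 and later the system class loader is no longer a URLClassLoader and the cast throws ClassCastException. A version-independent sketch reads the java.class.path system property instead:

```java
import java.io.File;

public class ClasspathDump {
    public static void main(String[] args) {
        // java.class.path holds the classpath the JVM was launched with,
        // regardless of the class loader implementation.
        String cp = System.getProperty("java.class.path");
        for (String entry : cp.split(File.pathSeparator)) {
            System.out.println(entry);
        }
    }
}
```

This shows the launch classpath only; entries added dynamically by custom class loaders will not appear.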

Based on that output you can adjust the classpath entries (opencv-jar in this case, or whatever else turns out to be missing) and see whether it works.
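Since opencv.jar is already on the classpath but the error is a *native* link error, the missing piece is usually the OpenCV native library (libopencv_java*.so) on each task's java.library.path. A common approach is to ship the .so through the distributed cache and point the task JVMs at their working directory, where cache files are symlinked. This is only a sketch; the file name `libopencv_java320.so` and the HDFS path are assumptions you must adjust to your OpenCV build and cluster layout:

```java
// Ship the native library with the job; the "#name" URI fragment creates
// a symlink with that name in each task's working directory.
job.addCacheFile(new URI("hdfs:///user/hduser/libopencv_java320.so#libopencv_java320.so"));
// Make the task JVMs look in their working directory for native libraries.
job.getConfiguration().set("mapreduce.map.java.opts", "-Djava.library.path=.");
```

The mapper then still needs a `System.loadLibrary(...)` call (for example in a static initializer) before `new CascadeClassifier(...)` is executed.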
