Custom Hive InputFormat not found when using "where 1=1"



I am using Hive and I run into an exception when executing a query that uses a custom InputFormat.

Hive works fine when I run select * from micmiu_blog;, but with select * from micmiu_blog where 1=1; the framework apparently cannot find my custom InputFormat class.
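For context, the table is declared with the custom InputFormat roughly like the following (the column list and the output format here are illustrative placeholders, not my exact DDL; only the InputFormat class name is the real one from the error):

CREATE TABLE micmiu_blog (id STRING, content STRING)
STORED AS
  INPUTFORMAT 'hiveinput.MyDemoInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';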

I have already put the JAR file into "hive/lib" and "hadoop/lib", and added "hadoop/lib" to the CLASSPATH. Here is the log:

hive> select * from micmiu_blog where 1=1;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1415530028127_0004, Tracking URL = http://hadoop01-master:8088/proxy/application_1415530028127_0004/
Kill Command = /home/hduser/hadoop-2.2.0/bin/hadoop job  -kill job_1415530028127_0004
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2014-11-09 19:53:32,996 Stage-1 map = 0%,  reduce = 0%
2014-11-09 19:53:52,010 Stage-1 map = 100%,  reduce = 0%
Ended Job = job_1415530028127_0004 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1415530028127_0004_m_000000 (and more) from job job_1415530028127_0004
Task with the most failures(4): 
-----
Task ID:
  task_1415530028127_0004_m_000000
URL:
  http://hadoop01-master:8088/taskdetails.jsp?jobid=job_1415530028127_0004&tipid=task_1415530028127_0004_m_000000
-----
Diagnostic Messages for this Task:
Error: java.io.IOException: cannot find class hiveinput.MyDemoInputFormat
        at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:564)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched: 
Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

I just ran into this same problem. A plain select * is executed as a local fetch task inside the Hive CLI JVM, where the jar you placed in hive/lib is visible; once you add the WHERE clause, Hive launches a MapReduce job, and the task JVMs on the cluster cannot find your class unless the jar is shipped with the job. Add the JAR through the Hive CLI so it gets distributed:

hive> add jar /usr/lib/xxx.jar;
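After the add jar, you can verify the registration and re-run the failing query in the same session; a minimal sketch (the /usr/lib/xxx.jar path is just a placeholder for wherever your InputFormat jar actually lives):

hive> add jar /usr/lib/xxx.jar;
hive> list jars;
hive> select * from micmiu_blog where 1=1;

If you want the jar available in every session rather than adding it each time, a common alternative (depending on your Hive setup) is to point hive.aux.jars.path at it in hive-site.xml or to drop the jar into $HIVE_HOME/auxlib.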
