Phoenix-HBase cannot execute an UPSERT query via JDBC



First, I want to clarify that the CREATE query works fine.

When I execute query 1 followed by query 3, it errors on query 3; but queries 2 and 3 together work fine. I could not find anything about this online.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

class test {
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
        Connection connection = DriverManager.getConnection("jdbc:phoenix:localhost:2181/hbase-insecure");
        // (note: I have tried without /hbase-insecure, the result is the same)
        // query 1:
        connection.createStatement().executeUpdate("UPSERT INTO tableName VALUES('1','randomValue','randomValue',1234567890, 'randomValue', 'randomValue')");
        // query 2:
        connection.createStatement().executeUpdate("CREATE TABLE IF NOT EXISTS tableName (A VARCHAR(40), Z.B.type VARCHAR, Z.C VARCHAR, Z.D UNSIGNED_LONG, Z.E VARCHAR, X.F VARCHAR CONSTRAINT rowkey PRIMARY KEY (A))");
        // query 3:
        connection.commit();
    }
}
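One detail worth noting about why the failure surfaces at query 3 rather than query 1: Phoenix connections default to auto-commit off, so UPSERTs are buffered client-side and only sent to the server when commit() is called. A minimal sketch of the same flow with auto-commit enabled, which flushes each statement immediately (the URL and UPSERT values are taken from the post; the sketch only attempts a connection when a JDBC URL is passed as an argument, since it assumes a running Phoenix/HBase cluster):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class UpsertSketch {
    // With auto-commit on, the mutation is sent as part of executeUpdate(),
    // so any server-side or serialization error surfaces at the statement
    // that caused it instead of at a later commit().
    public static void upsert(Connection conn) throws SQLException {
        conn.setAutoCommit(true); // standard JDBC; supported by Phoenix
        try (Statement st = conn.createStatement()) {
            st.executeUpdate("UPSERT INTO tableName VALUES('1','randomValue','randomValue',1234567890, 'randomValue', 'randomValue')");
        }
    }

    public static void main(String[] args) throws SQLException {
        if (args.length == 0) {
            return; // no cluster URL supplied; nothing to do
        }
        try (Connection conn = DriverManager.getConnection(args[0])) {
            upsert(conn);
        }
    }
}
```

This does not fix the NoSuchMethodError below, but it narrows down which statement actually triggers it.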

Error:

Exception in thread "streaming-job-executor-0" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.KeyValueUtil.length(Lorg/apache/hadoop/hbase/Cell;)I
    at org.apache.phoenix.util.PhoenixKeyValueUtil.calculateMutationDiskSize(PhoenixKeyValueUtil.java:182)
    at org.apache.phoenix.execute.MutationState.calculateMutationSize(MutationState.java:800)
    at org.apache.phoenix.execute.MutationState.send(MutationState.java:971)
    at org.apache.phoenix.execute.MutationState.send(MutationState.java:1344)
    at org.apache.phoenix.execute.MutationState.commit(MutationState.java:1167)
    at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:670)
    at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:666)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:666)
    at com.kratinmobile.uep.services.SparkStream.lambda$null$0(SparkStream.java:119)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at com.kratinmobile.uep.services.SparkStream.lambda$startStreaming$10899135$1(SparkStream.java:102)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Looking at the stack trace, this very much looks like a classpath or version mismatch: the NoSuchMethodError on KeyValueUtil.length(Cell) suggests the Phoenix client jar was compiled against a different HBase version than the one actually on the runtime classpath.
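One way to confirm such a mismatch is to ask the JVM which jar a suspect class was actually loaded from. A small diagnostic sketch (the class name WhichJar is mine; in the failing application you would pass org.apache.hadoop.hbase.KeyValueUtil.class — here a locally defined class stands in so the sketch runs with a plain JDK):

```java
// Diagnostic sketch: print the jar/path a class was loaded from, to track
// down NoSuchMethodError caused by mixed library versions on the classpath.
public class WhichJar {
    static String locationOf(Class<?> cls) {
        java.security.CodeSource cs = cls.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap loader (JDK classes) may report
        // no code source at all.
        return cs == null ? "(bootstrap / JDK)" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the failing app, replace WhichJar.class with
        // org.apache.hadoop.hbase.KeyValueUtil.class and run it inside the
        // same Spark submit environment, so the same classpath is in effect.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

If the printed path points at an hbase-client jar whose version differs from the one your Phoenix client was built for, aligning the two versions (or using the matching phoenix-client jar for your HBase release) should resolve the error.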
