Phoenix "org.apache.phoenix.spark.DefaultSource" error



I am new to Phoenix and I am trying to load an HBase table through Phoenix. When I try to load it, I get the error below.

java.lang.ClassNotFoundException: org.apache.phoenix.spark.DefaultSource

My code:

package com.vas.reports

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object PhoenixRead {
  case class Record(NO: Int, NAME: String, DEPT: Int)

  def main(args: Array[String]) {
    val sc = new SparkContext("local", "phoenixsample")
    val sqlcontext = new SQLContext(sc)
    val numWorkers = sc.getExecutorStorageStatus.map(_.blockManagerId.executorId).filter(_ != "driver").length
    import sqlcontext.implicits._

    val df1 = sc.parallelize(List(
      (2, "Varun", 58),
      (3, "Alice", 45),
      (4, "kumar", 55)
    )).toDF("NO", "NAME", "DEPT")

    df1.show()
    println(numWorkers)
    println("printing df")
    val df = sqlcontext.load("org.apache.phoenix.spark", Map("table" -> "udm_main", "zkUrl" -> "phoenix url:2181/hbase-unsecure"))
    df.show()
  }
}
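(Side note: SQLContext.load is deprecated since Spark 1.4. A minimal sketch of the equivalent DataFrameReader call, reusing the same table name and zkUrl placeholder as above, would be:)

val df = sqlcontext.read
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "udm_main", "zkUrl" -> "phoenix url:2181/hbase-unsecure"))
  .load()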

Spark-submit command:

spark-submit --class com.vas.reports.PhoenixRead --jars /home/hadoop1/phoenix-core-4.4.0-HBase-1.1.jar /shared/test/ratna-0.0.1-SNAPSHOT.jar

Please look into this and advise me.

This happens because you need to add the following library files to hbase_home/lib and spark_home/lib.

In hbase_home/lib:

  • phoenix-spark-4.7.0-HBase-1.1.jar
  • phoenix-4.7.0-HBase-1.1-server.jar

In spark_home/lib:

  • phoenix-spark-4.7.0-HBase-1.1.jar
  • phoenix-4.7.0-HBase-1.1-client.jar
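The key point is that org.apache.phoenix.spark.DefaultSource lives in the phoenix-spark module, not in phoenix-core, so putting only the phoenix-core jar on --jars is not enough. If you cannot drop the jars into spark_home/lib, a sketch of shipping them at submit time instead, assuming the 4.7.0 jars sit in /home/hadoop1 (the same directory as in the question; the exact path is an assumption):

spark-submit --class com.vas.reports.PhoenixRead \
  --jars /home/hadoop1/phoenix-spark-4.7.0-HBase-1.1.jar,/home/hadoop1/phoenix-4.7.0-HBase-1.1-client.jar \
  /shared/test/ratna-0.0.1-SNAPSHOT.jar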
