I am new to Couchbase. I am trying to write data into Couchbase in local mode. My sample code is as follows:
import org.apache.spark.{SparkConf, SparkContext}
import com.couchbase.client.java.document.{JsonArrayDocument, JsonDocument}
import com.couchbase.client.java.document.json.{JsonArray, JsonObject}

val cfg = new SparkConf()
  .setAppName("couchbaseQuickstart")
  .setMaster("local[*]")
  .set("com.couchbase.bucket.MyBucket", "pwd")

val sc = new SparkContext(cfg)

val doc1 = JsonDocument.create("doc1", JsonObject.create().put("some", "content"))
val doc2 = JsonArrayDocument.create("doc2", JsonArray.from("more", "content", "in", "here"))

val data = sc.parallelize(Seq(doc1, doc2))
But I can't access data.saveToCouchbase().
I am using Spark 1.6.1 & Scala 2.11.8.
I have the following dependencies in my build.sbt:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1"
libraryDependencies += "com.couchbase.client" % "spark-connector_2.11" % "1.2.1"
How can I write data into Couchbase using Spark & Scala?
It looks like you are just missing an import statement, which is what brings the Couchbase functions on RDDs and DataFrames into scope:
import org.apache.spark.{SparkConf, SparkContext}
import com.couchbase.client.java.document.{JsonArrayDocument, JsonDocument}
import com.couchbase.client.java.document.json.{JsonArray, JsonObject}
import com.couchbase.spark._  // enables saveToCouchbase() and other implicit Couchbase operations

val cfg = new SparkConf()
  .setAppName("couchbaseQuickstart")
  .setMaster("local[*]")
  .set("com.couchbase.bucket.MyBucket", "pwd")

val sc = new SparkContext(cfg)

val doc1 = JsonDocument.create("doc1", JsonObject.create().put("some", "content"))
val doc2 = JsonArrayDocument.create("doc2", JsonArray.from("more", "content", "in", "here"))

val data = sc.parallelize(Seq(doc1, doc2))
data.saveToCouchbase()
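The same import also adds read helpers to the SparkContext in the 1.x connector, which is handy for checking that the write worked. Below is a sketch, assuming the bucket configuration above; note that couchbaseGet is typed by the document class being fetched, so the JsonArrayDocument would need a separate call:

// Fetch "doc1" back from the bucket by ID to verify the write.
sc.couchbaseGet[JsonDocument](Seq("doc1"))
  .collect()
  .foreach(println)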