Cannot connect to Cassandra from Spark



I have some test data in Cassandra. I am trying to read it from Spark, but I get an error like this:

py4j.protocol.Py4JJavaError: An error occurred while calling o25.load.
java.io.IOException: Failed to open native connection to Cassandra at {127.0.1.1}:9042

Here is what I have done so far:

  1. Started ./bin/cassandra
  2. Created test data with CQL, using keyspace "testkeyspace2" and table "emp" with some keys and corresponding values.
  3. Wrote standalone.py
  4. Ran the following spark-submit command:

    sudo ./bin/spark-submit --jars spark-streaming-kafka-assembly_2.10-1.6.0.jar 
    --packages TargetHolding:pyspark-cassandra:0.2.4 
    examples/src/main/python/standalone.py
    
  5. Got the error shown above.
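For reference, step 2 can be reproduced with CQL along these lines. The question only gives the keyspace and table names, so the column names and row values here are hypothetical:

```cql
-- Hypothetical schema and data; only "testkeyspace2" and "emp" come from the question.
CREATE KEYSPACE IF NOT EXISTS testkeyspace2
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};

CREATE TABLE IF NOT EXISTS testkeyspace2.emp (
  emp_id int PRIMARY KEY,
  emp_name text
);

INSERT INTO testkeyspace2.emp (emp_id, emp_name) VALUES (1, 'alice');
```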


standalone.py:

from pyspark import SparkContext, SparkConf
from pyspark.sql import SQLContext
conf = SparkConf().setAppName("Stand Alone Python Script")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)
loading = (sqlContext.read
           .format("org.apache.spark.sql.cassandra")
           .options(table="emp", keyspace="testkeyspace2")
           .load())
loading.show()

I also tried --packages datastax:spark-cassandra-connector:1.5.0-RC1-s_2.11, but I got the same error.


Debugging:

I checked

netstat -tulpn | grep -i listen | grep <cassandra_pid>

and saw that it is listening on port 9042.
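netstat shows which address the process is bound to, but the connector's failure is about which loopback address is actually reachable. A quick way to check both candidates from Python, using only the standard library (the port and the two addresses are the ones appearing in the stack trace):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns 0 on success, an errno value on failure
        return s.connect_ex((host, port)) == 0

# Cassandra's native transport port is 9042; the trace shows the driver
# trying 127.0.1.1, while Cassandra may only be bound to 127.0.0.1.
for host in ("127.0.0.1", "127.0.1.1"):
    print(host, "open" if port_open(host, 9042) else "closed")
```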


Full stack trace:

Traceback (most recent call last):
  File "~/Dropbox/Work/ITNow/spark/spark-1.6.0/examples/src/main/python/standalone.py", line 8, in <module>
    .options(table="emp", keyspace = "testkeyspace2")
  File "~/Dropbox/Work/ITNow/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 139, in load
  File "~/Dropbox/Work/ITNow/spark/spark-1.6.0/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__
  File "~/Dropbox/Work/ITNow/spark/spark-1.6.0/python/lib/pyspark.zip/pyspark/sql/utils.py", line 45, in deco
  File "~/Dropbox/Work/ITNow/spark/spark-1.6.0/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o25.load.
: java.io.IOException: Failed to open native connection to Cassandra at {127.0.1.1}:9042
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:164)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
    at com.datastax.spark.connector.rdd.partitioner.CassandraRDDPartitioner$.getTokenFactory(CassandraRDDPartitioner.scala:176)
    at org.apache.spark.sql.cassandra.CassandraSourceRelation$.apply(CassandraSourceRelation.scala:203)
    at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:57)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:209)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /127.0.1.1:9042 (com.datastax.driver.core.TransportException: [/127.0.1.1:9042] Cannot connect))
    at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:227)
    at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:82)
    at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1307)
    at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:339)
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:157)
    ... 22 more

Am I doing something wrong?

I am really new to all of this, so I could use some advice. Thanks!

Based on our conversation in the question comments, the problem is that "localhost" was used for rpc_address in your cassandra.yaml file. Cassandra used the operating system to resolve "localhost" to 127.0.0.1 and listened explicitly on that interface.

To fix this, either update rpc_address to 127.0.1.1 in cassandra.yaml and restart Cassandra, or update your SparkConf to reference 127.0.0.1, i.e.:

conf = (SparkConf().setAppName("Stand Alone Python Script")
                   .set("spark.cassandra.connection.host", "127.0.0.1"))

One thing that does seem odd to me, though: spark.cassandra.connection.host also defaults to "localhost", so it is strange that the Spark Cassandra connector resolves "localhost" to 127.0.1.1 while Cassandra resolves it to 127.0.0.1.

I checked my Linux hosts file at /etc/hosts, and the contents were like:

127.0.0.1       localhost
127.0.1.1       <my hostname>

I changed it to:

127.0.0.1       localhost
127.0.0.1       <my hostname>

and it worked fine.
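You can confirm how both names resolve with a quick standard-library check; the exact addresses printed depend on your /etc/hosts:

```python
import socket

# "localhost" normally resolves to 127.0.0.1 via /etc/hosts.
print("localhost ->", socket.gethostbyname("localhost"))

# On Debian/Ubuntu installs the machine's own hostname is often mapped to
# 127.0.1.1 instead, which is exactly the mismatch seen in the stack trace.
hostname = socket.gethostname()
try:
    print(hostname, "->", socket.gethostbyname(hostname))
except socket.gaierror:
    print(hostname, "-> (does not resolve)")
```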

As you can see at line number 58 in your own log file, it mentions Your hostname, ganguly resolves to a loopback address: 127.0.1.1; using 192.168.1.32 instead (on interface wlan0), and I guess this applies in your case as well.

Add this alongside your --packages dependency; it worked perfectly for me:

    --conf spark.cassandra.connection.host="127.0.0.1"
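Combined with the submit command from the question, the full invocation would look something like this (a sketch; the jar and script paths are the ones the asker used):

```shell
sudo ./bin/spark-submit \
  --jars spark-streaming-kafka-assembly_2.10-1.6.0.jar \
  --packages TargetHolding:pyspark-cassandra:0.2.4 \
  --conf spark.cassandra.connection.host="127.0.0.1" \
  examples/src/main/python/standalone.py
```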
