NoSuchElementException: key not found: 'int' with Spark Cassandra

I am getting the following error with Cassandra 3.0.5 and Scala 2.10:

Exception in thread "main" java.util.NoSuchElementException: key not found: 'int'
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:58)
        at scala.collection.MapLike$class.apply(MapLike.scala:141)
        at scala.collection.AbstractMap.apply(Map.scala:58)
        at com.datastax.spark.connector.types.ColumnType$.fromDriverType(ColumnType.scala:81)
        at com.datastax.spark.connector.cql.ColumnDef$.apply(Schema.scala:117)
        at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchPartitionKey$1.apply(Schema.scala:199)
        at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchPartitionKey$1.apply(Schema.scala:198)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
        at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:153)
        at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:306)
        at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
        at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1(Schema.scala:246)

Here are my Spark dependencies:

<!-- Spark dependencies -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.4.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.4.1</version>
</dependency>
<!-- Connectors -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.5.0-M3</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.5.0-M2</version>
</dependency>

And here is my Java code:

// Spark Cassandra Connector connection and auth settings
SparkConf conf = new SparkConf();
conf.setAppName("Java API demo");
conf.setMaster("local");
conf.set("spark.cassandra.connection.host", "localhost");
conf.set("spark.cassandra.connection.port", "9042");
conf.set("spark.cassandra.connection.timeout_ms", "40000");
conf.set("spark.cassandra.read.timeout_ms", "200000");
conf.set("spark.cassandra.auth.username", "username");
conf.set("spark.cassandra.auth.password", "password");

SimpleSpark app = new SimpleSpark(conf);
app.run();
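
SimpleSpark.run() itself is not shown here; for context only, below is a minimal sketch of what a run() method like this typically does with the Java connector (japi). The keyspace and table names are hypothetical placeholders, not part of the actual code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

public class SimpleSpark {
    private final SparkConf conf;

    public SimpleSpark(SparkConf conf) {
        this.conf = conf;
    }

    public void run() {
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Reading any table triggers schema discovery, which is where
        // ColumnType.fromDriverType from the stack trace above is invoked.
        long rows = javaFunctions(sc)
                .cassandraTable("my_keyspace", "my_table") // hypothetical names
                .count();
        System.out.println("row count = " + rows);
        sc.stop();
    }
}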

I believe the versions I am using are compatible; what is causing this error?

Update your com.datastax.spark connector to 1.5.0-RC1 instead of 1.5.0-M3; this was a bug in 1.5.0-M3.

<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.5.0-RC1</version>
</dependency>
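
If you keep the separate spark-cassandra-connector-java artifact as well, move it onto the same release line rather than leaving it at 1.5.0-M2. Whether a 1.5.0-RC1 build of that artifact is published is an assumption to verify on Maven Central before copying this sketch:

<!-- Assumption: a matching -java artifact exists for this release; verify on Maven Central. -->
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.5.0-RC1</version>
</dependency>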
