I am trying to write from a Spark DataFrame to Cassandra. It works when I have a simple DataFrame schema, as in this example:
root
|-- id: string (nullable = true)
|-- url: string (nullable = true)
However, when I try to write a DataFrame whose schema contains StructTypes, like this:
root
|-- crawl: struct (nullable = true)
| |-- id: string (nullable = true)
I get the following exception:
Exception in thread "main" java.lang.IllegalArgumentException: Unsupported type: StructType(StructField(id,StringType,true))
at com.datastax.spark.connector.types.ColumnType$.unsupportedType$1(ColumnType.scala:132)
at com.datastax.spark.connector.types.ColumnType$.fromSparkSqlType(ColumnType.scala:155)
at com.datastax.spark.connector.mapper.DataFrameColumnMapper$$anonfun$1.apply(DataFrameColumnMapper.scala:18)
at com.datastax.spark.connector.mapper.DataFrameColumnMapper$$anonfun$1.apply(DataFrameColumnMapper.scala:16)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at com.datastax.spark.connector.mapper.DataFrameColumnMapper.newTable(DataFrameColumnMapper.scala:16)
at com.datastax.spark.connector.cql.TableDef$.fromDataFrame(Schema.scala:215)
at com.datastax.spark.connector.DataFrameFunctions.createCassandraTable(DataFrameFunctions.scala:26)
My code looks like this:
// connector implicits that provide createCassandraTable on DataFrame
import com.datastax.spark.connector._

val df = sqlContext.read.parquet(input)
df.createCassandraTable(keyspace, table)   // the exception above is thrown here
df.write
.format("org.apache.spark.sql.cassandra")
.options(Map("table" -> table, "keyspace" -> keyspace))
.save()
Any help?
It looks like the connector does not currently support creating UDTs dynamically from DataFrame structs. You should file a ticket in the Spark Cassandra Connector Jira as a feature request. Until that lands, you can always create the type manually so it matches your struct type; a sketch of that manual approach is below.
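For example, here is a minimal sketch of the manual workaround, using the connector's CassandraConnector to run the CQL by hand. The UDT name crawl_type, the top-level id primary key, and the table layout are assumptions made up for this example (the printed struct schema above is presumably trimmed); adapt them to your actual schema and connector version.

import com.datastax.spark.connector.cql.CassandraConnector

// Create the UDT and the table by hand instead of calling createCassandraTable.
CassandraConnector(sqlContext.sparkContext.getConf).withSessionDo { session =>
  // UDT mirroring the nested struct: crawl.id (Spark string -> CQL text)
  session.execute(
    s"CREATE TYPE IF NOT EXISTS $keyspace.crawl_type (id text)")
  // Table using the UDT for the struct column; the primary key here is illustrative only
  session.execute(
    s"""CREATE TABLE IF NOT EXISTS $keyspace.$table (
       |  id text PRIMARY KEY,
       |  crawl frozen<crawl_type>
       |)""".stripMargin)
}

// With the table already in place, the DataFrame write itself stays the same:
df.write
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> table, "keyspace" -> keyspace))
  .save()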