When I try to convert an RDD to a DataFrame in Spark, I get the exception "Can not infer schema for type:".
Example:
>>> print rangeRDD.take(1)[0]
(301,301,10)
>>> sqlContext.inferSchema(rangeRDD)
Can not infer schema for type: <type 'unicode'>
Any pointers on how to fix this? I even tried injecting a schema via sqlContext.createDataFrame(rdd, schema):
from pyspark.sql.types import StructType, StructField, IntegerType

schema = StructType([
    StructField("x", IntegerType(), True),
    StructField("y", IntegerType(), True),
    StructField("z", IntegerType(), True)])
df = sqlContext.createDataFrame(rangeRDD, schema)
print df.first()
but it ends up with the runtime error "ValueError: Unexpected tuple u'(301,301,10)' with StructType".
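The root cause can be seen without a Spark context: each element of rangeRDD is one unicode string, not a tuple of three ints, so schema inference reports `<type 'unicode'>` and the three-field StructType has nothing to bind to. A minimal local sketch:

```python
line = u'(301,301,10)'

# The whole record is a single string...
print(type(line))   # unicode on Python 2, str on Python 3
# ...so len() counts characters, not fields.
print(len(line))
```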
Try parsing the data first:
>>> rangeRDD = sc.parallelize([u'(301,301,10)'])
>>> tupleRangeRDD = rangeRDD.map(lambda x: x[1:-1]) \
...     .map(lambda x: x.split(",")) \
...     .map(lambda x: [int(y) for y in x])
>>> df = sqlContext.createDataFrame(tupleRangeRDD, schema)
>>> df.first()
Row(x=301, y=301, z=10)
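The three map steps above are plain string handling, so they can be collapsed into one helper (the name parse_range is illustrative) that is unit-testable without a Spark context and then applied with a single map:

```python
def parse_range(line):
    """Strip the surrounding parentheses, split on commas, cast to int."""
    return [int(field) for field in line[1:-1].split(",")]

# Sketch of the equivalent pipeline, assuming sc and schema from above:
# tupleRangeRDD = rangeRDD.map(parse_range)
# df = sqlContext.createDataFrame(tupleRangeRDD, schema)
```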