PySpark error when creating a df from an RDD: TypeError: Can not infer schema for type: <type 'float'>



I am converting an RDD to a DataFrame with the following code:

time_df = time_rdd.toDF(['my_time'])
and get the following error:
TypeError Traceback (most recent call last)
<ipython-input-40-ab9e3025f679> in <module>()
----> 1 time_df = time_rdd.toDF(['my_time'])
/usr/local/spark-latest/python/pyspark/sql/session.py in toDF(self, schema, sampleRatio)
     55         [Row(name=u'Alice', age=1)]
     56         """
---> 57         return sparkSession.createDataFrame(self, schema, sampleRatio)
     58 
     59     RDD.toDF = toDF
/usr/local/spark-latest/python/pyspark/sql/session.py in createDataFrame(self, data, schema, samplingRatio)
    518 
    519         if isinstance(data, RDD):
--> 520             rdd, schema = self._createFromRDD(data.map(prepare), schema, samplingRatio)
    521         else:
    522             rdd, schema = self._createFromLocal(map(prepare, data), schema)
/usr/local/spark-latest/python/pyspark/sql/session.py in _createFromRDD(self, rdd, schema, samplingRatio)
    358         """
    359         if schema is None or isinstance(schema, (list, tuple)):
--> 360             struct = self._inferSchema(rdd, samplingRatio)
    361             converter = _create_converter(struct)
    362             rdd = rdd.map(converter)
/usr/local/spark-latest/python/pyspark/sql/session.py in _inferSchema(self, rdd, samplingRatio)
    338 
    339         if samplingRatio is None:
--> 340             schema = _infer_schema(first)
    341             if _has_nulltype(schema):
    342                 for row in rdd.take(100)[1:]:
/usr/local/spark-latest/python/pyspark/sql/types.py in _infer_schema(row)
    987 
    988     else:
--> 989         raise TypeError("Can not infer schema for type: %s" % type(row))
    990 
    991     fields = [StructField(k, _infer_type(v), True) for k, v in items]
TypeError: Can not infer schema for type: <type 'float'>
Does anyone know what I'm missing? Thanks!

You should wrap each float in a tuple, like so:

time_rdd.map(lambda x: (x, )).toDF(['my_time'])
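
For context, here is a minimal, self-contained sketch of the fix; the SparkSession setup, app name, and sample values are assumptions added for illustration:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("float-to-df").getOrCreate()
sc = spark.sparkContext

# An RDD of bare floats fails schema inference, because toDF() expects
# each element to be a Row, tuple, list, or dict.
time_rdd = sc.parallelize([1.0, 2.5, 3.75])

# Wrapping each float in a one-element tuple lets Spark infer the schema.
time_df = time_rdd.map(lambda x: (x, )).toDF(['my_time'])
time_df.show()

The resulting DataFrame has a single DoubleType column named my_time.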

Check whether time_rdd is actually an RDD.

What do you get from the following?

>>> type(time_rdd)
>>> dir(time_rdd)
