I have a DataFrame that looks like this:
 |-- Col1: string (nullable = true)
 |-- Col2: string (nullable = true)
 |-- Col3: struct (nullable = true)
 |    |-- 513: long (nullable = true)
 |    |-- 549: long (nullable = true)
Using
df.select("Col1","Col2","Col3.*").show
+---------+-----+---+---+
|     Col1| Col2|513|549|
+---------+-----+---+---+
|AAAAAAAAA|BBBBB| 39| 38|
+---------+-----+---+---+
Now I want to rename the struct fields so the output looks like this:
+---------+-----+--------+--------+
|     Col1| Col2|Col3=513|Col3=549|
+---------+-----+--------+--------+
|AAAAAAAAA|BBBBB|      39|      38|
+---------+-----+--------+--------+
The fields inside the struct are dynamic, so I can't use withColumnRenamed.
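For reference, a DataFrame with this shape can be built like this (a minimal sketch; the literal values and the local SparkSession setup are only there to reproduce the example, and nullability may differ slightly from the schema above):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.struct

val spark = SparkSession.builder().master("local[*]").appName("rename-struct-fields").getOrCreate()
import spark.implicits._

// Build a DataFrame whose Col3 is a struct with numeric field names,
// mirroring the schema shown above.
val df = Seq(("AAAAAAAAA", "BBBBB", 39L, 38L))
  .toDF("Col1", "Col2", "513", "549")
  .select($"Col1", $"Col2", struct($"513", $"549").as("Col3"))

df.select("Col1", "Col2", "Col3.*").show()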
Since you are asking about renaming the fields inside the struct, you can do that with the schema DSL:
import org.apache.spark.sql.types._
import spark.implicits._  // needed for the $"..." column syntax

// Take Col3's current struct schema and rebuild it with prefixed field names
val schema: StructType = df.schema.fields.find(_.name == "Col3").get.dataType.asInstanceOf[StructType]
val newSchema = StructType(schema.fields.map(sf => StructField("Col3=" + sf.name, sf.dataType, sf.nullable)))

df
  .withColumn("Col3", $"Col3".cast(newSchema))
  .printSchema()
gives
root
 |-- Col1: string (nullable = true)
 |-- Col2: string (nullable = true)
 |-- Col3: struct (nullable = false)
 |    |-- Col3=513: long (nullable = true)
 |    |-- Col3=549: long (nullable = true)
Then you can select($"Col3.*") as before.
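Putting the cast and the final select together, a minimal end-to-end sketch (assuming df and the SparkSession from above):

import org.apache.spark.sql.types.StructType
import spark.implicits._

// Prefix every field of Col3 with "Col3=", then flatten the struct in one select.
val col3Schema = df.schema("Col3").dataType.asInstanceOf[StructType]
val prefixed = StructType(col3Schema.fields.map(f => f.copy(name = s"Col3=${f.name}")))

df.withColumn("Col3", $"Col3".cast(prefixed))
  .select($"Col1", $"Col2", $"Col3.*")
  .show()

// Expected output, roughly:
// +---------+-----+--------+--------+
// |     Col1| Col2|Col3=513|Col3=549|
// +---------+-----+--------+--------+
// |AAAAAAAAA|BBBBB|      39|      38|
// +---------+-----+--------+--------+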
You could also unpack the struct first and then rename all the columns whose names are numbers...
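A minimal sketch of that alternative, assuming that every purely numeric column name after unpacking must have come from the struct (a convention specific to this example):

import spark.implicits._

// Unpack the struct first, then rename every column whose name is purely numeric
// by rebuilding the DataFrame with toDF.
val unpacked = df.select($"Col1", $"Col2", $"Col3.*")

val renamed = unpacked.columns.map { c =>
  if (c.nonEmpty && c.forall(_.isDigit)) s"Col3=$c" else c
}

unpacked.toDF(renamed: _*).show()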