Scala Spark - copy data from one dataframe into another DF with a nested schema and the same column names



DF1 - flat dataframe with data

+---------+--------+-------+                                                    
|FirstName|LastName| Device|
+---------+--------+-------+
|   Robert|Williams|android|
|    Maria|Sharpova| iphone|
+---------+--------+-------+
root
|-- FirstName: string (nullable = true)
|-- LastName: string (nullable = true)
|-- Device: string (nullable = true)

DF2 - empty dataframe with the same column names (nested inside structs)

+------+----+
|header|body|
+------+----+
+------+----+
root
|-- header: struct (nullable = true)
|    |-- FirstName: string (nullable = true)
|    |-- LastName: string (nullable = true)
|-- body: struct (nullable = true)
|    |-- Device: string (nullable = true)

DF2 schema code:

import org.apache.spark.sql.types._

val schema = StructType(Array(
  StructField("header", StructType(Array(
    StructField("FirstName", StringType),
    StructField("LastName", StringType)))),
  StructField("body", StructType(Array(
    StructField("Device", StringType))))
))
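
For reference, an empty DF2 with this schema can be built from an empty RDD. A minimal sketch, assuming an existing SparkSession named spark (the variable name emptyDF2 is illustrative):

import org.apache.spark.sql.Row

val emptyDF2 = spark.createDataFrame(spark.sparkContext.emptyRDD[Row], schema)
emptyDF2.printSchema  // matches the DF2 schema shown above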

DF2 populated with the data from DF1 would be the final output.

This needs to be done for many columns of a complex schema and has to be configurable. It must be done without using case classes.


APPROACH#1-使用schema.fields.map映射DF1->DF2?

Approach #2 - create a new DF and define both the data and the schema?

Approach #3 - use zip and map transformations to build "select col as col" queries... not sure whether this works for a nested (StructType) schema.

How should I go about this?

import spark.implicits._
import org.apache.spark.sql.functions._

val sourceDF = Seq(
  ("Robert", "Williams", "android"),
  ("Maria", "Sharpova", "iphone")
).toDF("FirstName", "LastName", "Device")

val resDF = sourceDF
  .withColumn("header", struct('FirstName, 'LastName))
  .withColumn("body", struct(col("Device")))
  .select('header, 'body)
resDF.printSchema
//  root
//  |-- header: struct (nullable = false)
//  |    |-- FirstName: string (nullable = true)
//  |    |-- LastName: string (nullable = true)
//  |-- body: struct (nullable = false)
//  |    |-- Device: string (nullable = true)
resDF.show(false)
//  +------------------+---------+
//  |header            |body     |
//  +------------------+---------+
//  |[Robert, Williams]|[android]|
//  |[Maria, Sharpova] |[iphone] |
//  +------------------+---------+
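
To make this configurable for a larger schema rather than hard-coding header and body (the question's Approach #1), the nested columns can be derived from DF2's schema. Below is a minimal sketch reusing the schema and sourceDF defined above; it assumes every top-level field of the target schema is a struct whose nested field names exist as flat columns in DF1, and the names nestedCols/confDF are illustrative:

import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.functions.{col, struct}

// Build one struct column per top-level field of the target schema,
// selecting the flat source columns by the nested field names.
val nestedCols = schema.fields.map { f =>
  val nested = f.dataType.asInstanceOf[StructType]
  struct(nested.fieldNames.map(col): _*).as(f.name)
}

val confDF = sourceDF.select(nestedCols: _*)
// confDF has the same layout as resDF: a header struct and a body struct

This only covers the single level of nesting shown here; a schema with deeper structs would need a recursive version of the same mapping.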
