Combining multiple structs into one struct in Spark SQL



Here is my input:

val df = Seq(
("Adam","Angra", "Anastasia"),
("Boris","Borun", "Bisma"),
("Shawn","Samar", "Statham")
).toDF("fname", "mname", "lname")
df.createOrReplaceTempView("df")

I want the Spark SQL output to look like this:

struct
{"data_description":"fname","data_details":"Adam"},{"data_description":"mname","data_details":"Angra"},{"data_description":"lname","data_details":"Anastasia"}
{"data_description":"fname","data_details":"Boris"},{"data_description":"mname","data_details":"Borun"},{"data_description":"lname","data_details":"Bisma"}
{"data_description":"fname","data_details":"Shawn"},{"data_description":"mname","data_details":"Samar"},{"data_description":"lname","data_details":"Statham"}

Here is what I have tried so far:

val df1 = spark.sql("""select concat(fname,':',mname,':',lname) as name from df""")
df1.createOrReplaceTempView("df1")
val df2 = spark.sql("""
  select
    named_struct('data_description','fname','data_details',split(name, ':')[0]) as struct1,
    named_struct('data_description','mname','data_details',split(name, ':')[1]) as struct2,
    named_struct('data_description','lname','data_details',split(name, ':')[2]) as struct3
  from df1""")
df2.createOrReplaceTempView("df2")

The above outputs:

struct1 struct2 struct3
{"data_description":"fname","data_details":"Adam"}  {"data_description":"mname","data_details":"Angra"} {"data_description":"lname","data_details":"Anastasia"}
{"data_description":"fname","data_details":"Boris"} {"data_description":"mname","data_details":"Borun"} {"data_description":"lname","data_details":"Bisma"}
{"data_description":"fname","data_details":"Shawn"} {"data_description":"mname","data_details":"Samar"} {"data_description":"lname","data_details":"Statham"}

But that gives me three separate struct columns. I need a single column with the structs separated by commas.

The SQL statement is below; you know how to do the rest.

val sql = """
select
concat_ws(
','
,concat('{"data_description":"fname","data_details":"',fname,'"}')
,concat('{"data_description":"mname","data_details":"',mname,'"}')
,concat('{"data_description":"lname","data_details":"',lname,'"}')
) as struct
from df
"""

You can create an array of structs, then use to_json if you want the output as a string:

spark.sql("""
select  to_json(array(
named_struct('data_description','fname','data_details', fname),
named_struct('data_description','mname','data_details', mname), 
named_struct('data_description','lname','data_details', lname) 
)) as struct
from  df
""").show()
//+----------------------------------------------------------------------------------------------------------------------------------------------------------------+
//|struct                                                                                                                                                          |
//+----------------------------------------------------------------------------------------------------------------------------------------------------------------+
//|[{"data_description":"fname","data_details":"Adam"},{"data_description":"mname","data_details":"Angra"},{"data_description":"lname","data_details":"Anastasia"}]|
//|[{"data_description":"fname","data_details":"Boris"},{"data_description":"mname","data_details":"Borun"},{"data_description":"lname","data_details":"Bisma"}]   |
//|[{"data_description":"fname","data_details":"Shawn"},{"data_description":"mname","data_details":"Samar"},{"data_description":"lname","data_details":"Statham"}] |
//+----------------------------------------------------------------------------------------------------------------------------------------------------------------+

If you have many columns, you can generate the struct SQL expressions dynamically like this:

val structs = df.columns.map(c => s"named_struct('data_description','$c','data_details', $c)").mkString(",")
val df2 = spark.sql(s"""
select  to_json(array($structs)) as struct
from  df
""")

If you don't want an array, you can simply concatenate the results of to_json over the 3 structs:

val structs = df.columns.map(c => s"to_json(named_struct('data_description','$c','data_details', $c))").mkString(",")
val df2 = spark.sql(s"""
select  concat_ws(',', $structs) as struct
from  df
""")
