So I have the following dataset, which includes dates in month-name, day, year format.
df = spark.read.format('csv').options(header = 'true').load(r"D:\datasets\googleplaystore.csv")
df.select('App', 'Last Updated').show()
I get the output:
+--------------------+------------------+
| App| Last Updated|
+--------------------+------------------+
|Photo Editor & Ca...| January 7, 2018|
| Coloring book moana| January 15, 2018|
|U Launcher Lite –...| August 1, 2018|
|Sketch - Draw & P...| June 8, 2018|
|Pixel Draw - Numb...| June 20, 2018|
|Paper flowers ins...| March 26, 2017|
|Smoke Effect Phot...| April 26, 2018|
| Infinite Painter| June 14, 2018|
|Garden Coloring Book|September 20, 2017|
|Kids Paint Free -...| July 3, 2018|
|Text on Photo - F...| October 27, 2017|
|Name Art Photo Ed...| July 31, 2018|
|Tattoo Name On My...| April 2, 2018|
|Mandala Coloring ...| June 26, 2018|
|3D Color Pixel by...| August 3, 2018|
|Learn To Draw Kaw...| June 6, 2018|
When I try to convert this date to a specific format, say "yyyyMMdd":
df.select('App', date_format(('Last Updated'), "yyyyMMdd").alias("date")).show()
I get:
+--------------------+----+
| App|date|
+--------------------+----+
|Photo Editor & Ca...|null|
| Coloring book moana|null|
|U Launcher Lite –...|null|
|Sketch - Draw & P...|null|
|Pixel Draw - Numb...|null|
|Paper flowers ins...|null|
|Smoke Effect Phot...|null|
| Infinite Painter|null|
|Garden Coloring Book|null|
|Kids Paint Free -...|null|
|Text on Photo - F...|null|
|Name Art Photo Ed...|null|
|Tattoo Name On My...|null|
|Mandala Coloring ...|null|
|3D Color Pixel by...|null|
|Learn To Draw Kaw...|null|
|Photo Designer - ...|null|
|350 Diy Room Deco...|null|
Not sure where I am going wrong. Please help.
I would also like to know how to filter using the date. I know I should use lit(), lt, gt and so on, but I am not sure of the correct syntax for this dataset.
Any help will be appreciated.
Thanks.
Here is the complete solution for both points:

First, the date parsing. `date_format` accepts a date column and formats it into any combination, but here `Last Updated` is a string column. To convert the string to a date, you need `to_date`. See below, where I parse the string to a date.
data = sqlContext.createDataFrame([
    ["Photo Editor & Ca...", "  January 7, 2018"],
    [" Coloring book moana", " January 15, 2018"],
    ["U Launcher Lite –...", "   August 1, 2018"],
    ["ketch - Draw & P...", "     June 8, 2018"],
    ["Pixel Draw - Numb...", "    June 20, 2018"],
    ["Paper flowers ins...", "   March 26, 2017"],
    ["moke Effect Phot...", "   April 26, 2018"],
    [" Infinite Painter", "    June 14, 2018"],
    ["Garden Coloring Book", "September 20, 2017"],
    ["Kids Paint Free -...", "     July 3, 2018"],
    ["Text on Photo - F...", "  October 27, 2017"],
    ["Name Art Photo Ed...", "    July 31, 2018"],
    ["Tattoo Name On My...", "    April 2, 2018"],
    ["Mandala Coloring ...", "    June 26, 2018"],
    ["3D Color Pixel by...", "   August 3, 2018"],
    ["Learn To Draw Kaw...", "     June 6, 2018"]
], ["app", "Last Updated"])
from pyspark.sql import functions as F
parsed_date_data = data.withColumn(
    "date",
    F.to_date(
        F.trim(F.col("Last Updated")),
        "MMMM dd, yyyy"
    )
)
parsed_date_data.show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
| Coloring book moana| January 15, 2018|2018-01-15|
|U Launcher Lite –...|    August 1, 2018|2018-08-01|
| ketch - Draw & P...| June 8, 2018|2018-06-08|
|Pixel Draw - Numb...| June 20, 2018|2018-06-20|
|Paper flowers ins...| March 26, 2017|2017-03-26|
| moke Effect Phot...| April 26, 2018|2018-04-26|
| Infinite Painter| June 14, 2018|2018-06-14|
|Garden Coloring Book|September 20, 2017|2017-09-20|
|Kids Paint Free -...| July 3, 2018|2018-07-03|
|Text on Photo - F...| October 27, 2017|2017-10-27|
|Name Art Photo Ed...| July 31, 2018|2018-07-31|
|Tattoo Name On My...| April 2, 2018|2018-04-02|
|Mandala Coloring ...| June 26, 2018|2018-06-26|
|3D Color Pixel by...| August 3, 2018|2018-08-03|
|Learn To Draw Kaw...| June 6, 2018|2018-06-06|
+--------------------+------------------+----------+
Second, how to apply filters to the dataframe:
parsed_date_data.where("date = '2018-01-07'").show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
parsed_date_data.filter("date = '2018-01-07'").show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
parsed_date_data.where(F.col("date") == '2018-01-07').show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
parsed_date_data.filter(F.col("date") == '2018-01-07').show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
parsed_date_data.filter(parsed_date_data.date == '2018-01-07').show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
parsed_date_data.where(parsed_date_data.date == '2018-01-07').show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
parsed_date_data.where(parsed_date_data.date.isin('2018-01-07')).show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
parsed_date_data.filter(parsed_date_data.date.isin('2018-01-07')).show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|Photo Editor & Ca...| January 7, 2018|2018-01-07|
+--------------------+------------------+----------+
You can even apply sub-filters:
parsed_date_data.filter(F.month(parsed_date_data.date) == '08').show()
+--------------------+------------------+----------+
| app| Last Updated| date|
+--------------------+------------------+----------+
|U Launcher Lite –...|    August 1, 2018|2018-08-01|
|3D Color Pixel by...| August 3, 2018|2018-08-03|
+--------------------+------------------+----------+
Here is the complete API for understanding the pyspark functions.
The problem you are facing is that `date_format` expects a date column (such as `current_date()`), not a string. So you first need to convert "January 7, 2018" to a date using the `to_date` function.
scala> val df1 = df.withColumn("date format",to_date($"Last Updated","MMMM dd, yyyy"))
df1: org.apache.spark.sql.DataFrame = [App: string, Last Updated: string ... 1 more field]
scala> df1.show()
+-----------------+---------------+-----------+
| App| Last Updated|date format|
+-----------------+---------------+-----------+
|Photo Editor & Ca|January 7, 2018| 2018-01-07|
+-----------------+---------------+-----------+
Then apply `date_format`:
scala> val df2 = df1.withColumn("date",date_format($"date format","yyyyMMdd"))
df2: org.apache.spark.sql.DataFrame = [App: string, Last Updated: string ... 2 more fields]
scala> df2.show()
+-----------------+---------------+-----------+--------+
| App| Last Updated|date format| date|
+-----------------+---------------+-----------+--------+
|Photo Editor & Ca|January 7, 2018| 2018-01-07|20180107|
+-----------------+---------------+-----------+--------+
Reference:
https://docs-snaplogic.atlassian.net/wiki/spaces/sd/pages/2458071/date functions and