I want to validate the date column of a PySpark DataFrame. I know how to do this for pandas, but I can't make it work for PySpark.
import pandas as pd
from datetime import datetime

data = [['Alex', 10, '2001-01-12'], ['Bob', 12, '2005-10-21'], ['Clarke', 13, '2003-12-41']]
df = pd.DataFrame(data, columns=['Name', 'Sale_qty', 'DOB'])
sparkDF = spark.createDataFrame(df)

def validate(date_text):
    try:
        if date_text != datetime.strptime(date_text, "%Y-%m-%d").strftime('%Y-%m-%d'):
            raise ValueError
        return True
    except ValueError:
        return False

df = df['DOB'].apply(lambda x: validate(x))
print(df)
It works for the pandas DataFrame, but I can't get it to work for PySpark. I get the following error:
sparkDF = sparkDF['DOB'].apply(lambda x: validate(x))

TypeError                                 Traceback (most recent call last)
<ipython-input-83-5f5f1db1c7b3> in <module>
----> 1 sparkDF = sparkDF['DOB'].apply(lambda x: validate(x))

TypeError: 'Column' object is not callable
You can use the following column expression:
F.to_date('DOB', 'yyyy-M-d').isNotNull()
Full test:
from pyspark.sql import functions as F
data = [['Alex', 10, '2001-01-12'], ['Bob', 12, '2005'], ['Clarke', 13, '2003-12-41']]
df = spark.createDataFrame(data, ['Name', 'Sale_qty', 'DOB'])
validation = F.to_date('DOB', 'yyyy-M-d').isNotNull()
df.withColumn('validation', validation).show()
# +------+--------+----------+----------+
# | Name|Sale_qty| DOB|validation|
# +------+--------+----------+----------+
# | Alex| 10|2001-01-12| true|
# | Bob| 12| 2005| false|
# |Clarke| 13|2003-12-41| false|
# +------+--------+----------+----------+
You can use to_date() with the required source date format. It returns null for values that don't match the format, which can be used for validation. See the following example.
from pyspark.sql import functions as func

spark.sparkContext.parallelize([('01-12-2001',), ('2001-01-12',)]).toDF(['dob']) \
    .withColumn('correct_date_format', func.to_date('dob', 'yyyy-MM-dd').isNotNull()) \
    .show()
# +----------+-------------------+
# | dob|correct_date_format|
# +----------+-------------------+
# |01-12-2001| false|
# |2001-01-12| true|
# +----------+-------------------+