Spark date parsing



I'm using the function below to parse dates in this format: 2009-01-23 18:15:05

  def loadTransactions (sqlContext: SQLContext, path: String): DataFrame = {
    val rowRdd = sqlContext.sparkContext.textFile(path).map { line =>
      val tokens = line.split(',')
      val dt = new DateTime(tokens(0))
      Row(new Timestamp(dt.getMillis))
    }
    val fields = Seq(
      StructField("timestamp", TimestampType, true)
    )
    val schema = StructType(fields)
    sqlContext.createDataFrame(rowRdd, schema)
  }

Spark throws this error:

java.lang.IllegalArgumentException: Invalid format: "2009-01-23 18:15:05" is malformed at " 18:15:05" at org.joda.time.format.DateTimeParserBucket.doParseMillis

I think this is because the milliseconds are missing.
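The milliseconds are likely not the problem: joda-time's `DateTime(String)` constructor expects an ISO 8601 string, which separates date and time with a `T` ("2009-01-23T18:15:05"), so the parse stops at the space — exactly where the error message points. An explicit pattern avoids this; here is a minimal sketch using `java.time` from the standard library, which behaves the same way:

```scala
import java.sql.Timestamp
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

// An explicit pattern with a space separator parses the input directly,
// instead of relying on the ISO 8601 default (which requires a 'T').
val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
val ldt = LocalDateTime.parse("2009-01-23 18:15:05", fmt)
val ts  = Timestamp.valueOf(ldt)
println(ts) // 2009-01-23 18:15:05.0
```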

How about something like this?

import org.apache.spark.sql.functions.regexp_extract

def loadTransactions(sqlContext: SQLContext, path: String): DataFrame = {
  import sqlContext.implicits._  // needed for toDF and the $"..." column syntax
  sqlContext.sparkContext.textFile(path).toDF("text").select(
    regexp_extract($"text", "^(.*?),", 1).cast("timestamp").alias("timestamp"))
}
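To see what `regexp_extract` does per row, here is a plain-Scala sketch of the same pattern: the non-greedy group `^(.*?),` captures everything before the first comma. The sample record below is hypothetical (a timestamp followed by two made-up fields):

```scala
// Non-greedy group: capture everything up to the first comma.
val FirstField = "^(.*?),.*".r
val line = "2009-01-23 18:15:05,42.0,GBP"  // hypothetical CSV record
val extracted = line match {
  case FirstField(ts) => ts
  case _              => ""
}
println(extracted) // 2009-01-23 18:15:05
```

Spark then casts that string column to `timestamp`, since `yyyy-MM-dd HH:mm:ss` is a format the cast accepts.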

Instead of using joda-time, you can use the approach below:

  def loadTransactions(sqlContext: SQLContext, path: String): DataFrame = {
    val rowRdd = sqlContext.sparkContext.textFile(path).map { line =>
      val tokens = line.split(',')
      Row(getTimestamp(tokens(0)))
    }
    val fields = Seq(
      StructField("timestamp", TimestampType, true)
    )
    val schema = StructType(fields)
    sqlContext.createDataFrame(rowRdd, schema)
  }

Use the following function to do the conversion to a timestamp:

  def getTimestamp(x: String): java.sql.Timestamp = {
    // "HH" is the 24-hour field; "hh" (1-12) would misparse times like 18:15:05
    val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
    if (x == "")
      null
    else
      new Timestamp(format.parse(x).getTime)
  }
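A quick standalone check of the converter (same logic, condensed into one self-contained snippet):

```scala
import java.sql.Timestamp
import java.text.SimpleDateFormat

def getTimestamp(x: String): Timestamp =
  if (x == "") null
  else {
    // 24-hour "HH" pattern, matching the input format
    val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
    new Timestamp(format.parse(x).getTime)
  }

println(getTimestamp("2009-01-23 18:15:05")) // 2009-01-23 18:15:05.0
println(getTimestamp(""))                    // null
```

One caveat: `SimpleDateFormat` is not thread-safe, so creating it inside the function (as here) is the safe choice when the function runs inside Spark tasks.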
