SparkSQL: Ignoring invalid JSON files



I am using SparkSQL to load a bunch of JSON files, but some of them have problems.

I want to keep processing the other files while ignoring the bad ones. How can I do that?

I tried using a try-catch, but it still fails. Example:

try {
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext._
    val jsonFiles=sqlContext.jsonFile("/requests.loading")
} catch {
    case _: Throwable => // Catching all exceptions and not doing anything with them
}

It fails with the following error:

14/11/20 01:20:44 INFO scheduler.TaskSetManager: Starting task 3065.0 in stage 1.0 (TID 6150, HDdata2, NODE_LOCAL, 1246 bytes)
14/11/20 01:20:44 WARN scheduler.TaskSetManager: Lost task 3027.1 in stage 1.0 (TID 6130, HDdata2): com.fasterxml.jackson.core.JsonParseException: Unexpected end-of-input: was expecting closing quote for a string value
 at [Source: java.io.StringReader@753ab9f1; line: 1, column: 1805]

If you are using Spark 1.2, Spark SQL will handle these corrupt JSON records for you. Here is an example...

// requests.loading has some broken records
val jsonFiles=sqlContext.jsonFile("/requests.loading")
// Look at the schema of jsonFiles, you will see a new column called "_corrupt_record", which holds all broken JSON records
// jsonFiles.printSchema
// Register jsonFiles as a table
jsonFiles.registerTempTable("jsonTable")
// To query all normal records
sqlContext.sql("SELECT * FROM jsonTable WHERE _corrupt_record IS NULL")
// To query all broken JSON records
sqlContext.sql("SELECT _corrupt_record FROM jsonTable WHERE _corrupt_record IS NOT NULL")
