Spark SQL 2.0: NullPointerException on a valid PostgreSQL query



I have a valid PostgreSQL query: when I copy/paste it into psql, I get the expected results.
But when I run it through Spark SQL, it causes a NullPointerException.

Here is the code snippet that triggers the error:

extractDataFrame().show()
private def extractDataFrame(): DataFrame = {
  val query =
    """(
      SELECT events.event_facebook_id, events.name, events.tariffrange,
        eventscounts.attending_count, eventscounts.declined_count, eventscounts.interested_count,
        eventscounts.noreply_count,
        artists.facebookid as artist_facebook_id, artists.likes as artistlikes,
        organizers.organizerid, organizers.likes as organizerlikes,
        places.placeid, places.capacity, places.likes as placelikes
      FROM events
        LEFT JOIN eventscounts on eventscounts.event_facebook_id = events.event_facebook_id
        LEFT JOIN eventsartists on eventsartists.event_id = events.event_facebook_id
          LEFT JOIN artists on eventsartists.artistid = artists.facebookid
        LEFT JOIN eventsorganizers on eventsorganizers.event_id = events.event_facebook_id
          LEFT JOIN organizers on eventsorganizers.organizerurl = organizers.facebookurl
        LEFT JOIN eventsplaces on eventsplaces.event_id = events.event_facebook_id
          LEFT JOIN places on eventsplaces.placefacebookurl = places.facebookurl
      ) df"""
  spark.sqlContext.read.jdbc(databaseURL, query, connectionProperties)
}

The SparkSession is defined as follows:

val databaseURL = "jdbc:postgresql://dbHost:5432/ticketapp" 
val spark = SparkSession
  .builder
  .master("local[*]")
  .appName("tariffPrediction")
  .getOrCreate()
val connectionProperties = new Properties
connectionProperties.put("user", "simon")
connectionProperties.put("password", "root")

Here is the full stack trace:
[SparkException: Job aborted due to stage failure: Task 0 in stage 27.0 failed 1 times, most recent failure: Lost task 0.0 in stage 27.0 (TID 27, localhost): java.lang.NullPointerException
    at org.apache.spark.sql.catalyst.expressions.codegen.UnsafeRowWriter.write(UnsafeRowWriter.java:210)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:370)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:246)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:240)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:784)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:784)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
    at org.apache.spark.scheduler.Task.run(Task.scala:85)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:]

The most surprising part is that if I remove any one of the LEFT JOIN clauses from the SQL query (no matter which one), I no longer get any error…

I ran into a very similar problem with a Teradata data source, and it came down to the columns' nullability on the DataFrame not matching the underlying data (a column was reported as nullable=false, but some rows had null values in that particular field). In my case, the cause was the Teradata JDBC driver not returning the correct column metadata. I haven't found a fix for that yet.
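One workaround sometimes used for this kind of nullability mismatch (a sketch, not part of the driver fix discussed above) is to rebuild the DataFrame with every column marked nullable, so the generated code stops assuming values can't be null:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.types.StructType

// Rebuild `df` with a schema in which every column is nullable.
// `spark` is assumed to be an existing SparkSession.
def withAllColumnsNullable(spark: SparkSession, df: DataFrame): DataFrame = {
  val relaxed = StructType(df.schema.map(_.copy(nullable = true)))
  // Round-tripping through the RDD applies the relaxed schema;
  // note this forces a deserialization step, so it has a cost.
  spark.createDataFrame(df.rdd, relaxed)
}
```

This doesn't fix the driver's metadata, it only stops Spark from trusting it.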

To look at the generated code (where the NPE is being thrown):

  • import org.apache.spark.sql.execution.debug._
  • call .debugCodegen() on the Dataset/DataFrame
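For example, applied to the `extractDataFrame()` helper from the question:

```scala
import org.apache.spark.sql.execution.debug._

// Prints the whole-stage-codegen Java source for each plan subtree,
// including the UnsafeRowWriter.write call seen in the stack trace.
extractDataFrame().debugCodegen()
```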

The problem is with the Teradata JDBC driver. It is discussed at https://community.teradata.com/t5/Connectivity/Teradata-JDBC-Driver-returns-the-wrong-schema-column-nullability/m-p/76667/highlight/true#M3798.

The root cause is discussed on the first page; the answer is on the third page.

The Teradata folks say they fixed it in the 16.10.* drivers with the MAYBENULL parameter, but I am still seeing inconsistent behavior.

Here is a similar discussion: https://issues.apache.org/jira/browse/SPARK-17195

If anyone else is still looking for a solution: you can use NULLIF on the columns causing the problem, when the issue is that a JOIN produces null values in columns the inferred schema originally marked as not null.
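As an illustrative sketch of that workaround on one column of the question's query (the column choice here is just an example): wrapping a joined, supposedly non-null column in NULLIF turns it into a computed expression, which the JDBC metadata reports as nullable. `NULLIF(x, NULL)` never changes the value, since `x = NULL` is never true:

```scala
// Hypothetical rewrite of one projected column from the question's query.
val query =
  """(
    SELECT events.event_facebook_id,
           NULLIF(eventscounts.attending_count, NULL) AS attending_count
    FROM events
      LEFT JOIN eventscounts
        ON eventscounts.event_facebook_id = events.event_facebook_id
    ) df"""
```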

Related JIRA: https://issues.apache.org/jira/browse/SPARK-18859
