Strange INT/LONG conversion bug with Spark GraphX



New Scala developer here, and also a new user of Spark GraphX. I've really been enjoying it so far, but I just ran into a very strange bug. I've isolated the problem to a Long-to-Int conversion, but it's really odd. Another strange thing is that it works fine on Windows but not on Linux (it creates an infinite loop). I've found the source of the problem on Linux, but I don't understand why it is a problem: I have to put the random-number limit into a variable first, and then it works.

You should be able to copy/paste and run the whole thing.

Scala 2.10.6, Spark 2.1.0, Linux Ubuntu 16.04

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._
import scala.util.Random

object Main extends App {
  //Template function to print any graph
  def printGraph[VD,ED] ( g : Graph[VD,ED] ): Unit = {
    g.vertices.collect.foreach( println )
  }
  def randomNumber(limit : Int) = {
    val start = 1
    val end   = limit
    val rnd = new Random
    start + rnd.nextInt( (end - start) + 1 )
  }
  val conf = new SparkConf()
    .setAppName("Simple Application")
    .setMaster("local[*]")
  val sc = new SparkContext(conf)
  sc.setLogLevel("ERROR")
  val myVertices = sc.makeRDD(Array((1L, "A"), (2L, "B"), (3L, "C"), (4L, "D"), (5L, "E"), (6L, "F")))
  val myEdges = sc.makeRDD(Array(Edge(1L, 2L, ""),
    Edge(1L, 3L, ""), Edge(1L, 6L, ""), Edge(2L, 3L, ""),
    Edge(2L, 4L, ""), Edge(2L, 5L, ""), Edge(3L, 5L, ""),
    Edge(4L, 6L, ""), Edge(5L, 6L, "")))
  val myGraph = Graph(myVertices, myEdges)
  //Add a random color to each vertex. This random color is chosen from the total number of vertices
  //Transform the vertex attribute to the color only
  val bug = myVertices.count()
  println("Long : " + bug)
  val bugInt = bug.toInt
  println("Int : " + bugInt)
  //Problem is here when adding myGraph.vertices.count().toInt inside randomNumber. Works on Windows, infinite loop on Linux.
  val g2 = myGraph.mapVertices( ( id, name  ) => ( randomNumber(myGraph.vertices.count().toInt) ))
 //Rest of code removed

}

Not sure whether you are looking for a fix or the root cause. The problem is that `count()` (an action) is being invoked inside the closure passed to `mapVertices` (a transformation). Spark does not support nesting RDD operations inside a transformation's closure: the closure runs on the executors, where the driver-side `SparkContext` that `count()` needs is not available, so the call cannot work reliably.

The solution is to compute the count once, on the driver, before the transformation:

val lim = myGraph.vertices.count().toInt
val g2 = myGraph.mapVertices( ( id, name  ) => ( randomNumber(lim) ))
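As a quick sanity check of the pattern itself, outside of Spark: the sketch below uses the question's own `randomNumber` helper with a precomputed limit (a constant here stands in for `myGraph.vertices.count().toInt`) and confirms every draw lands in the expected inclusive range.

```scala
import scala.util.Random

// Same helper as in the question: inclusive random integer in [1, limit].
def randomNumber(limit: Int): Int = {
  val start = 1
  val rnd = new Random
  start + rnd.nextInt((limit - start) + 1)
}

// Stand-in for myGraph.vertices.count().toInt: compute the limit once,
// on the driver, before it is captured by any closure.
val lim = 6
val inRange = (1 to 100).forall { _ =>
  val n = randomNumber(lim)
  n >= 1 && n <= lim
}
println(inRange) // prints true
```

Hoisting the count out of the closure also avoids recomputing the action once per vertex, so it is faster as well as correct.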
