ERROR Executor: Exception in task 1.0 in stage 1.0 (TID 1) java.net.NoRouteToHostException



I get this error every time I try to run my word-count Spark application. Please help. Below is the wordcount.scala file; after building with sbt package, I ran the spark-submit command.

package main

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Word Count")
    val sc = new SparkContext(conf)
    // Read the input file and split each line into words
    val textFile = sc.textFile("file:///usr/local/spark/README.md")
    val tokenizedData = textFile.flatMap(line => line.split(" "))
    // Pair each word with 1, then sum the counts per word
    val countPrep = tokenizedData.map(word => (word, 1))
    val counts = countPrep.reduceByKey((accumValue, newValue) => accumValue + newValue)
    // Sort by count, descending, and write the result out
    val sortedCounts = counts.sortBy(kvPair => kvPair._2, ascending = false)
    sortedCounts.saveAsTextFile("file:///usr/local/wordcount")
    sc.stop()
  }
}
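For reference, the same transformation chain can be checked without Spark on a plain Scala collection; this sketch (names are illustrative, not from the code above) mirrors flatMap, map-to-pairs, reduceByKey, and the descending sort:

```scala
// Plain-Scala sketch of the word-count pipeline, no Spark required.
object LocalWordCount {
  def count(lines: Seq[String]): Seq[(String, Int)] = {
    // flatMap: split every line into words
    val tokens = lines.flatMap(_.split(" "))
    // groupBy + size plays the role of map-to-(word, 1) + reduceByKey
    val counts = tokens.groupBy(identity).mapValues(_.size).toSeq
    // sort by count, descending (the Spark code's sortBy(_._2, ascending = false))
    counts.sortBy(-_._2)
  }

  def main(args: Array[String]): Unit = {
    println(count(Seq("a b a", "b a")))
  }
}
```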

Then I execute the following command:

 bin/spark-submit --class "main.WordCount" --master "local[*]" "/home/hadoop/SparkApps/target/scala-2.10/word-count_2.10-1.0.jar"

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
15/11/28 07:38:51 ERROR Executor: Exception in task 1.0 in stage 1.0 (TID 1)
java.net.NoRouteToHostException: No route to host
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:375)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$6.apply(Executor.scala:325)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$6.apply(Executor.scala:323)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:323)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:158)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Maybe you should add .setMaster("local").
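In code that would look like the following sketch (it builds on the question's existing SparkConf; note that settings made directly on SparkConf take precedence over the --master flag passed to spark-submit):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: hard-code the master in the application itself.
val conf = new SparkConf()
  .setAppName("Word Count")
  .setMaster("local")   // single-JVM local mode, so the executor runs in the driver process
val sc = new SparkContext(conf)
```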
