Kafka "Too many open files" exception after long-running producer job



I have a Kafka producer written in Java that uses the java.nio WatchService API to monitor a directory for new files; it picks up each new file and pushes it to a Kafka topic. A Spark Streaming consumer reads from that topic. After the producer job has been running for about a day, I get the error below. The producer pushes roughly 500 files every 2 minutes. My Kafka topic has 1 partition and a replication factor of 2. Can anyone help?

org.apache.kafka.common.KafkaException: Failed to construct kafka producer         
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:342) 
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:166) 
    at com.hp.hawkeye.HawkeyeKafkaProducer.Sender.createProducer(Sender.java:60) 
    at com.hp.hawkeye.HawkeyeKafkaProducer.Sender.<init>(Sender.java:38)   
    at com.hp.hawkeye.HawkeyeKafkaProducer.HawkeyeKafkaProducer.<init>(HawkeyeKafkaProducer.java:54) 
    at com.hp.hawkeye.HawkeyeKafkaProducer.myKafkaTestJob.main(myKafkaTestJob.java:81)
Caused by: org.apache.kafka.common.KafkaException: java.io.IOException: Too many open files
    at org.apache.kafka.common.network.Selector.<init>(Selector.java:125)
    at org.apache.kafka.common.network.Selector.<init>(Selector.java:147)  
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:306)
... 7 more 
Caused by: java.io.IOException: Too many open files         
     at sun.nio.ch.EPollArrayWrapper.epollCreate(Native Method)         
     at sun.nio.ch.EPollArrayWrapper.<init>(EPollArrayWrapper.java:130)        
     at sun.nio.ch.EPollSelectorImpl.<init>(EPollSelectorImpl.java:69)      
     at sun.nio.ch.EPollSelectorProvider.openSelector(EPollSelectorProvider.java:36) 
     at java.nio.channels.Selector.open(Selector.java:227)         
     at org.apache.kafka.common.network.Selector.<init>(Selector.java:123)     
 ... 9 more

Check ulimit -aH

Ask your administrator to increase the open-files limit, for example:

open files                      (-n) 655536
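You can inspect and raise the limit yourself from the shell. A minimal sketch, assuming a Linux host; the kafkauser account name and $PID are placeholders for your own values:

```shell
# Show the current soft and hard open-file limits for this shell.
ulimit -Sn
ulimit -Hn

# Raise the soft limit up to the hard limit for the current session.
ulimit -n "$(ulimit -Hn)"

# Count file descriptors held by a running process to spot a leak
# (replace $PID with the producer's process id):
#   ls /proc/$PID/fd | wc -l

# Permanent change on most Linux systems: /etc/security/limits.conf
# (kafkauser is a placeholder account name; re-login is required):
#   kafkauser  soft  nofile  65536
#   kafkauser  hard  nofile  65536
```

If the descriptor count from /proc keeps climbing while the job runs, the limit is only masking a leak and raising it will just delay the crash.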

Otherwise I suspect a leak in your code; see:

http://mail-archives.apache.org/mod_mbox/spark-user/201504.mbox/%3CCAKWX9VVJZObU9omOVCfPaJ_bPAJWiHcxeE7RyeqxUHPWvfj7WA@mail.gmail.com%3E
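With 500 files pushed every 2 minutes, a common cause of this error is constructing a new KafkaProducer (or opening a stream) per file and never closing it: each producer holds a Selector, which in turn holds epoll file descriptors, exactly where your stack trace fails. Construct the producer once at startup, reuse it for every send, and close it on shutdown. A minimal runnable sketch of the resource-safe pattern using only the JDK's WatchService (your actual producer code isn't shown, so the Kafka send is indicated in comments):

```java
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

public class DirWatcher {
    public static void main(String[] args) throws IOException, InterruptedException {
        Path dir = Files.createTempDirectory("watch-demo");

        // Open the WatchService once, in try-with-resources, so its underlying
        // selector file descriptors are always released when the job stops.
        try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
            dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);

            // Create a file so the watcher has something to report.
            Files.createFile(dir.resolve("sample.txt"));

            WatchKey key = watcher.take(); // blocks until an event arrives
            for (WatchEvent<?> event : key.pollEvents()) {
                System.out.println("new file: " + event.context());
                // In the real job, send the file here with a SINGLE long-lived
                // KafkaProducer built once before this loop and close()'d on
                // shutdown -- never a new KafkaProducer per file, since each
                // one opens its own Selector and epoll descriptors.
            }
            key.reset(); // re-arm the key so further events are delivered
        }
    }
}
```

The same rule applies to any InputStream you open while reading each file before sending: wrap it in try-with-resources so it is closed even when the send throws.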
