I am running into exactly the same error as described here. My goal is to use Logstash to read data from a Mongo collection into an Elastic index.
Setup
To do this I have been using Docker to spin up the ELK stack and the MongoDB database. Every service is on the same Docker network, elastic (a sketch of the container setup follows the list below).
- No user has been added in MongoDB
- Logstash settings are left at their defaults.
- The ELK stack version is 7.14.0.
- I downloaded the MongoDB JDBC driver here: http://www.dbschema.com/jdbc-drivers/MongoDbJdbcDriver.zip and unzipped the archive into ~/driver
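For reference, here is a minimal sketch of how such an environment could be started. The container names mongo01-test and es01-test are taken from the pipeline configuration and logs below; the image tags and flags are assumptions, not the exact commands I used.

# Assumed commands to reproduce the environment; adjust images and flags as needed.
docker network create elastic

# Single-node Elasticsearch, reachable as es01-test on the elastic network
docker run -d --name es01-test --net elastic -p 9200:9200 \
  -e "discovery.type=single-node" \
  docker.elastic.co/elasticsearch/elasticsearch:7.14.0

# MongoDB without authentication, reachable as mongo01-test (image tag assumed)
docker run -d --name mongo01-test --net elastic -p 27017:27017 mongo:4.4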
Pipeline configuration
Below is the pipeline configuration:
input {
  jdbc {
    jdbc_driver_library => "/opt/logstash/mongo_drivers/mongojdbc3.1.jar"
    jdbc_driver_class => "Java::com.dbschema.MongoJdbcDriver"
    jdbc_connection_string => "jdbc:mongodb://mongo01-test:27017/my-database"
    jdbc_user => ""
    schedule => "0 * * * *"
    statement => "db.items.find({});"
  }
}
output {
  elasticsearch {
    hosts => ["es01-test:9200"]
    index => "items-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
Docker run command
docker run --name log01-test --net elastic -v ~/pipeline:/usr/share/logstash/pipeline/ -v ~/driver/:/opt/logstash/mongo_drivers/ docker.elastic.co/logstash/logstash:7.14.0
Output logs
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bundler-1.17.3/lib/bundler/rubygems_integration.rb:200: warning: constant Gem::ConfigMap is deprecated
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2021-08-06T15:16:02,238][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
[2021-08-06T15:16:02,250][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.14.0", "jruby.version"=>"jruby 9.2.19.0 (2.5.8) 2021-06-15 55810c552b OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-08-06T15:16:02,273][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2021-08-06T15:16:02,289][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[2021-08-06T15:16:02,777][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"0dc63ab4-c361-4822-922d-7d981780e3b3", :path=>"/usr/share/logstash/data/uuid"}
[2021-08-06T15:16:03,832][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
[2021-08-06T15:16:03,836][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
Please configure Metricbeat to monitor Logstash. Documentation can be found at:
https://www.elastic.co/guide/en/logstash/current/monitoring-with-metricbeat.html
[2021-08-06T15:16:04,368][WARN ][deprecation.logstash.outputs.elasticsearch] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2021-08-06T15:16:04,980][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2021-08-06T15:16:05,191][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"http://elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch: Name or service not known"}
[2021-08-06T15:16:05,239][WARN ][logstash.licensechecker.licensereader] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch {:url=>http://elasticsearch:9200/, :error_message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
[2021-08-06T15:16:05,248][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch"}
[2021-08-06T15:16:05,302][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.
[2021-08-06T15:16:05,696][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-08-06T15:16:06,615][INFO ][org.reflections.Reflections] Reflections took 96 ms to scan 1 urls, producing 120 keys and 417 values
[2021-08-06T15:16:07,771][WARN ][deprecation.logstash.inputs.jdbc] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2021-08-06T15:16:07,882][WARN ][deprecation.logstash.outputs.elasticsearch] Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[2021-08-06T15:16:08,003][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//es01-test:9200"]}
[2021-08-06T15:16:08,030][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://es01-test:9200/]}}
[2021-08-06T15:16:08,082][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://es01-test:9200/"}
[2021-08-06T15:16:08,139][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.14.0) {:es_version=>7}
[2021-08-06T15:16:08,143][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-08-06T15:16:08,298][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2021-08-06T15:16:08,408][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/usr/share/logstash/pipeline/test-logstash.conf"], :thread=>"#<Thread:0x3cff6ed run>"}
[2021-08-06T15:16:09,644][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.23}
[2021-08-06T15:16:09,713][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-08-06T15:16:09,789][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
java.nio.file.NoSuchFileException: /usr/share/logstash/.DbSchema/logs/MongoDbJdbcDriver.log.lck
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:116)
at java.base/sun.nio.fs.UnixFileSystemProvider.newFileChannel(UnixFileSystemProvider.java:182)
at java.base/java.nio.channels.FileChannel.open(FileChannel.java:292)
at java.base/java.nio.channels.FileChannel.open(FileChannel.java:345)
at java.logging/java.util.logging.FileHandler.openFiles(FileHandler.java:511)
at java.logging/java.util.logging.FileHandler.<init>(FileHandler.java:307)
at com.dbschema.MongoJdbcDriver.<clinit>(MongoJdbcDriver.java:37)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:398)
at org.jruby.javasupport.JavaSupportImpl.loadJavaClass(JavaSupportImpl.java:157)
at org.jruby.javasupport.Java.getProxyClassOrNull(Java.java:961)
at org.jruby.javasupport.Java.getProxyClassOrNull(Java.java:948)
at org.jruby.javasupport.Java.getProxyOrPackageUnderPackage(Java.java:905)
at org.jruby.javasupport.JavaPackage.method_missing(JavaPackage.java:252)
at org.jruby.javasupport.JavaPackage$INVOKER$i$method_missing.call(JavaPackage$INVOKER$i$method_missing.gen)
at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:833)
at org.jruby.runtime.Helpers$MethodMissingMethod.call(Helpers.java:591)
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:196)
at org.jruby.runtime.callsite.CachingCallSite.callMethodMissing(CachingCallSite.java:440)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:352)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:144)
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:345)
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)
at org.jruby.ir.interpreter.Interpreter.INTERPRET_EVAL(Interpreter.java:106)
at org.jruby.ir.interpreter.Interpreter.evalCommon(Interpreter.java:158)
at org.jruby.ir.interpreter.Interpreter.evalWithBinding(Interpreter.java:182)
at org.jruby.RubyKernel.evalCommon(RubyKernel.java:1086)
at org.jruby.RubyKernel.eval(RubyKernel.java:1048)
at org.jruby.RubyKernel$INVOKER$s$0$3$eval.call(RubyKernel$INVOKER$s$0$3$eval.gen)
at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.sequel_minus_5_dot_45_dot_0.lib.sequel.adapters.jdbc.RUBY$method$load_driver$0(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.45.0/lib/sequel/adapters/jdbc.rb:55)
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)
at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_integration_minus_jdbc_minus_5_dot_1_dot_4.lib.logstash.plugin_mixins.jdbc.common.RUBY$method$load_driver$0(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.1.4/lib/logstash/plugin_mixins/jdbc/common.rb:27)
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_integration_minus_jdbc_minus_5_dot_1_dot_4.lib.logstash.plugin_mixins.jdbc.common.RUBY$method$load_driver$0$__VARARGS__(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.1.4/lib/logstash/plugin_mixins/jdbc/common.rb)
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)
at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_integration_minus_jdbc_minus_5_dot_1_dot_4.lib.logstash.inputs.jdbc.RUBY$method$run$0(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.1.4/lib/logstash/inputs/jdbc.rb:292)
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_integration_minus_jdbc_minus_5_dot_1_dot_4.lib.logstash.inputs.jdbc.RUBY$method$run$0$__VARARGS__(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.1.4/lib/logstash/inputs/jdbc.rb)
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)
at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)
at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$inputworker$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:405)
at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$inputworker$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)
at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)
at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$block$start_input$1(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:396)
at org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138)
at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)
at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52)
at org.jruby.runtime.Block.call(Block.java:139)
at org.jruby.RubyProc.call(RubyProc.java:318)
at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)
at java.base/java.lang.Thread.run(Thread.java:829)
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/cronline.rb:77: warning: constant ::Fixnum is deprecated
[2021-08-06T15:16:35,282][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}
[2021-08-06T15:16:35,633][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"http://elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch: Name or service not known"}
^C[2021-08-06T15:16:39,600][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2021-08-06T15:16:39,623][FATAL][logstash.runner ] SIGINT received. Terminating immediately..
[2021-08-06T15:16:39,646][FATAL][org.logstash.Logstash ]
org.jruby.exceptions.ThreadKill: null
Then I noticed that the directory /usr/share/logstash/.DbSchema/ is never created. Is there a Logstash setting I need to configure to fix this?
Any help on this topic would be greatly appreciated, thank you very much!
Root cause
The root cause lies in the source code of the MongoDbJdbcDriver class, in its static initializer here.
The problematic line 37 is the following:
final FileHandler fileHandler = new FileHandler(System.getProperty("user.home") + "/.DbSchema/logs/MongoDbJdbcDriver.log");
This only works if the directory ~/.DbSchema/logs already exists, which is not the case when running the Logstash container.
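To illustrate, here is a minimal standalone sketch (not the driver's actual code, just the same java.util.logging.FileHandler call it makes) showing why the missing directory produces the exception above:

import java.util.logging.FileHandler;

public class FileHandlerDemo {
    public static void main(String[] args) throws Exception {
        // FileHandler does not create missing parent directories.
        // If ~/.DbSchema/logs does not exist, this constructor fails with
        // java.nio.file.NoSuchFileException while creating the .lck lock file,
        // matching the stack trace shown in the question.
        new FileHandler(System.getProperty("user.home") + "/.DbSchema/logs/MongoDbJdbcDriver.log");
    }
}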
- Use another MongoDB driver that is suitable for JDBC
- Or build a Docker image on top of the Logstash image, with the following content:
FROM docker.elastic.co/logstash/logstash:7.14.0
RUN mkdir -p /usr/share/logstash/.DbSchema/logs
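Then build the image and use it in place of the stock Logstash image (the tag logstash-mongo is just an example name):

docker build -t logstash-mongo .
docker run --name log01-test --net elastic \
  -v ~/pipeline:/usr/share/logstash/pipeline/ \
  -v ~/driver/:/opt/logstash/mongo_drivers/ \
  logstash-mongo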