I am using Elasticsearch-1.5.1, Kibana-4.0.2-linux-x86, and Logstash-1.4.2. My logstash conf looks like this:
input {
  redis {
    data_type => "list"
    key => "pace"
    password => "bhushan"
    type => "pace"
  }
}
filter {
  geoip {
    source => "mdc.ip"
    target => "geoip"
    database => "/opt/logstash-1.4.2/vendor/geoip/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
}
output {
  if [type] == "pace" {
    elasticsearch {
      template_overwrite => true
      host => "localhost"
      index => "pace"
      template => "/opt/logstash-1.4.2/mytemplates/elasticsearch-template.json"
      template_name => "bhushan"
    }
  }
  stdout {
    codec => rubydebug
  }
}
My elasticsearch-template.json:
{
  "template" : "bhushan",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : {"enabled" : true},
      "dynamic_templates" : [ {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fields" : {
              "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
            }
          }
        }
      } ],
      "properties" : {
        "@version": { "type": "string", "index": "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic": true,
          "properties" : {
            "location" : { "type" : "geo_point" }
          }
        }
      }
    }
  }
}
When I query the mapping with curl http://localhost:9200/pace/_mapping/pace/field/geoip.location?pretty, I get:
{
  "pace" : {
    "mappings" : {
      "pace" : {
        "geoip.location" : {
          "full_name" : "geoip.location",
          "mapping" : {
            "location" : {
              "type" : "double"
            }
          }
        }
      }
    }
  }
}
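What I expect is for geoip.location to be mapped as geo_point, i.e. the same curl should return something like:

{
  "pace" : {
    "mappings" : {
      "pace" : {
        "geoip.location" : {
          "full_name" : "geoip.location",
          "mapping" : {
            "location" : {
              "type" : "geo_point"
            }
          }
        }
      }
    }
  }
}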
A sample log record looks like this:
{
    "thread_name" => "main",
    "mdc.ip" => "14.X.X.X",
    "message" => "Hii, I m in info",
    "@timestamp" => "2015-05-15T10:18:32.904+05:30",
    "level" => "INFO",
    "file" => "Test.java",
    "class" => "the.bhushan.log.test.Test",
    "line_number" => "15",
    "logger_name" => "bhushan",
    "method" => "main",
    "@version" => "1",
    "type" => "pace",
    "geoip" => {
        "ip" => "14.X.X.X",
        "country_code2" => "IN",
        "country_code3" => "IND",
        "country_name" => "India",
        "continent_code" => "AS",
        "region_name" => "16",
        "city_name" => "Mumbai",
        "latitude" => 18.974999999999994,
        "longitude" => 72.82579999999999,
        "timezone" => "Asia/Calcutta",
        "real_region_name" => "Maharashtra",
        "location" => [
            [0] 72.82579999999999,
            [1] 18.974999999999994
        ],
        "coordinates" => [
            [0] "72.82579999999999",
            [1] "18.974999999999994"
        ]
    }
}
I think my problem is the same as this one, so I did everything mentioned in that link, such as deleting all the old indices and restarting LS and ES, but no luck.
Your logstash filter stores the coordinates in the field geoip.coordinates, but in your elasticsearch-template.json mapping the field is called geoip.location. You can see this in your sample log record, where the geoip sub-object contains both a location field and a coordinates field.
I think you should be fine if you change this in your logstash filter, from this:

add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]

to this:

add_field => [ "[geoip][location]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][location]", "%{[geoip][latitude]}" ]
A few more remarks:

- The two add_field directives in the geoip filter can be removed altogether, since they are unnecessary: as your sample record shows, the geoip filter already populates geoip.location.
- "path": "full" can be removed, since it has been deprecated since ES v1.0.
- The template name should be pace instead of bhushan, i.e. the name of the index in which the log records are stored.
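Putting those remarks together, a corrected configuration might look like this (a sketch based on the suggestions above; paths and credentials unchanged from the question):

filter {
  geoip {
    source => "mdc.ip"
    target => "geoip"
    database => "/opt/logstash-1.4.2/vendor/geoip/GeoLiteCity.dat"
    # no add_field lines needed: the geoip filter already
    # populates geoip.location as [longitude, latitude]
  }
}

and in elasticsearch-template.json (rest of the template unchanged):

{
  "template" : "pace",
  "settings" : {
    "index.refresh_interval" : "5s"
  }
}

After deleting the old pace index and restarting Logstash and Elasticsearch, the geoip.location field should come up mapped as geo_point.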