Kafka
Collect and ship Kafka application logs to Logstash and Elasticsearch.
Filebeat is a lightweight shipper that enables you to send your Apache Kafka application logs to Logstash and Elasticsearch. Configure Filebeat using the pre-defined examples below to start sending and analysing your Apache Kafka application logs.
Step 1 - Install Filebeat
deb (Debian/Ubuntu/Mint)
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-amd64.deb
sudo dpkg -i filebeat-oss-7.8.1-amd64.deb
rpm (CentOS/RHEL/Fedora)
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-x86_64.rpm
sudo rpm -vi filebeat-oss-7.8.1-x86_64.rpm
macOS
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-darwin-x86_64.tar.gz
tar xzvf filebeat-oss-7.8.1-darwin-x86_64.tar.gz
Windows
- Download the filebeat Windows zip file from the official downloads page.
- Extract the contents of the zip file into C:\Program Files.
- Rename the filebeat-<version>-windows directory to filebeat.
- Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator). If you are running Windows XP, you may need to download and install PowerShell.
- Run the following commands to install Filebeat as a Windows service:
cd 'C:\Program Files\filebeat'
.\install-service-filebeat.ps1
If script execution is disabled on your system, set the execution policy for the current session so the script can run:
PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1
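Once the service is installed, it can be started from the same elevated PowerShell prompt. A minimal sketch using the standard `Start-Service` and `Get-Service` cmdlets:

```powershell
# Start the newly installed Filebeat service
Start-Service filebeat

# Confirm the service status is "Running"
Get-Service filebeat
```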
Step 2 - Enable module
There are several built-in Filebeat modules you can use. To enable the Kafka module, run the following:
deb/rpm
filebeat modules list
filebeat modules enable kafka
Windows
PS > .\filebeat.exe modules enable kafka
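Enabling the module activates a kafka.yml file under Filebeat's modules.d directory. A minimal sketch of what that file might contain; the /opt/kafka path is an assumption, so point var.kafka_home at your own Kafka installation directory:

```yaml
- module: kafka
  log:
    enabled: true
    # Assumed install location - adjust to your environment
    var.kafka_home: /opt/kafka
```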
Step 3 - Locate configuration file
deb/rpm: /etc/filebeat/filebeat.yml
macOS/Windows: <EXTRACTED_ARCHIVE>/filebeat.yml
Step 4 - Configure Output
We'll be shipping to Logstash so that we have the option to run filters before the data is indexed.
Comment out the elasticsearch output block.
## Comment out elasticsearch output
#output.elasticsearch:
# hosts: ["localhost:9200"]
Uncomment the logstash output block and change it to match the example below.
output.logstash:
hosts: ["your-logstash-host:your-ssl-port"]
loadbalance: true
ssl.enabled: true
Step 5 - Validate configuration
Let's check that the configuration file is syntactically correct by running Filebeat directly from the terminal.
If the file is invalid, Filebeat will print an "error loading config file" message with details on how to correct the problem.
deb/rpm
sudo filebeat -e -c /etc/filebeat/filebeat.yml
macOS
cd <EXTRACTED_ARCHIVE>
./filebeat -e -c filebeat.yml
Windows
cd <EXTRACTED_ARCHIVE>
.\filebeat.exe -e -c filebeat.yml
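Filebeat also ships with a test subcommand that checks the configuration and the connection to the configured output without starting the shipper. A quick sketch for a deb/rpm install:

```shell
# Validate the configuration file syntax
sudo filebeat test config -c /etc/filebeat/filebeat.yml

# Verify Filebeat can reach the configured Logstash output
sudo filebeat test output -c /etc/filebeat/filebeat.yml
```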
Step 6 - (Optional) Update Logstash Filters
All Logit stacks come pre-configured with popular Logstash filters. We would recommend that you add Kafka specific filters if you don't already have them, to ensure enhanced dashboards and modules work correctly.
Edit your Logstash filters by choosing Stack > Settings > Logstash Filters.
filter {
  if [fileset][module] == "kafka" {
    grok {
      match => { "message" => "(?m)\[%{TIMESTAMP_ISO8601:[kafka][log][timestamp]}\] %{LOGLEVEL:[kafka][log][level]} +%{JAVALOGMESSAGE:[kafka][log][message]} \(%{JAVACLASS:[kafka][log][class]}\)$[ \n]*(?'[kafka][log][trace][full]'.*)" }
    }
    grok {
      match => { "[kafka][log][message]" => "\[%{KAFKA_COMPONENT:[kafka][log][component]}\] +%{JAVALOGMESSAGE:[kafka][log][message]}" }
      pattern_definitions => { "KAFKA_COMPONENT" => "[^\]]*" }
    }
    if "_grokparsefailure" in [tags] {
      mutate { add_field => { "[kafka][log][component]" => "unknown" } }
    }
    grok {
      match => { "[kafka][log][trace][full]" => "%{JAVACLASS:[kafka][log][trace][class]}:\s*%{JAVALOGMESSAGE:[kafka][log][trace][message]}" }
    }
    mutate {
      rename => { "@timestamp" => "read_timestamp" }
    }
    date {
      match => [ "[kafka][log][timestamp]", "yyyy-MM-dd HH:mm:ss,SSS" ]
      target => "@timestamp"
    }
  }
}
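To sanity-check the first grok pattern, it helps to look at a raw broker log line. The line below is a hypothetical example of the server.log layout the pattern expects (`[timestamp] LEVEL message (class)`); the grep simply confirms the leading timestamp matches the `yyyy-MM-dd HH:mm:ss,SSS` layout the date filter parses:

```shell
# Hypothetical Kafka server.log line matching the grok layout above
line='[2024-05-01 09:15:42,123] INFO [KafkaServer id=0] started (kafka.server.KafkaServer)'

# Extract the bracketed timestamp; the ,SSS milliseconds match the date filter's pattern
echo "$line" | grep -oE '^\[[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2},[0-9]{3}\]'
```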