Filebeat is an open source shipping agent that lets you ship logs from local files to one or more destinations, including Logstash.
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.15.1-amd64.deb
sudo dpkg -i filebeat-oss-7.15.1-amd64.deb
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.15.1-x86_64.rpm
sudo rpm -vi filebeat-oss-7.15.1-x86_64.rpm
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.15.1-darwin-x86_64.tar.gz
tar xzvf filebeat-oss-7.15.1-darwin-x86_64.tar.gz
- Download the Filebeat Windows zip file from the official downloads page.
- Extract the contents of the zip file into C:\Program Files.
- Rename the filebeat-<version>-windows directory to Filebeat.
- Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator). If you are running Windows XP, you may need to download and install PowerShell.
- Run the following commands to install Filebeat as a Windows service:
cd 'C:\Program Files\Filebeat'
.\install-service-filebeat.ps1
If script execution is disabled on your system, set the execution policy for the current session to allow the script to run:
PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1
Configure the paths you wish to ship by editing the input path variables. These fully support wildcards and can also include a document type.
The default filebeat.inputs section looks like this:
filebeat.inputs:
- type: log
  # Change to true to enable this input configuration.
  enabled: false
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
An example with NGINX logs might look like the following:
filebeat.inputs:
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  paths:
    - /var/log/nginx/*.log
  fields:
    type: nginx-access
  fields_under_root: true
  encoding: utf-8
  exclude_files: [".gz"]
  ignore_older: 3h
Filebeat also has modules that can be listed, enabled or disabled using the following commands:
sudo filebeat modules list
sudo filebeat modules enable <module name>
sudo filebeat modules disable <module name>
cd <EXTRACTED_ARCHIVE>
./filebeat modules list
./filebeat modules enable <module name>
./filebeat modules disable <module name>
cd <EXTRACTED_ARCHIVE>
filebeat.exe modules list
filebeat.exe modules enable <module name>
filebeat.exe modules disable <module name>
Additionally, individual modules can be configured using the per-module config files located in the modules.d folder; most commonly this is used to read logs from a non-default location.
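As a sketch, a per-module override in modules.d/nginx.yml that reads NGINX logs from a non-default location might look like this (the paths shown are assumptions for illustration; adjust them to your own layout):

```yaml
# modules.d/nginx.yml
- module: nginx
  access:
    enabled: true
    # Override the default paths when access logs live somewhere non-standard.
    var.paths: ["/srv/www/logs/access.log*"]
  error:
    enabled: true
    var.paths: ["/srv/www/logs/error.log*"]
```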
We'll be shipping to Logstash so that we have the option to run filters before the data is indexed.
Comment out the elasticsearch output block.
## Comment out elasticsearch output
#output.elasticsearch:
#  hosts: ["localhost:9200"]
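With the Elasticsearch output disabled, the Logstash output needs to be enabled instead. A minimal sketch, assuming Logstash is listening for Beats input on the default port 5044 on localhost:

```yaml
output.logstash:
  # Assumed host and port; point this at your Logstash Beats input.
  hosts: ["localhost:5044"]
```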
Let's check that the configuration file is syntactically correct by running Filebeat directly in the terminal. If the file is invalid, Filebeat will print an error loading config file message with details on how to correct the problem.
sudo filebeat -e -c /etc/filebeat/filebeat.yml
cd <EXTRACTED_ARCHIVE>
./filebeat -e -c filebeat.yml
cd <EXTRACTED_ARCHIVE>
.\filebeat.exe -e -c filebeat.yml
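Recent Filebeat versions also include dedicated subcommands for checking the configuration and verifying connectivity to the configured output, which can be run from the same directory:

```shell
filebeat test config
filebeat test output
```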
Ok, time to start ingesting data!
sudo systemctl enable filebeat
sudo systemctl start filebeat
Filebeat is the most popular way to send logs to ELK due to its reliability & minimal memory footprint. It is the leading Beat out of the entire collection of open-source shipping tools, including Auditbeat, Metricbeat & Heartbeat.
Filebeat originated by combining key features from Logstash-Forwarder & Lumberjack, & is written in Go. Within the logging pipeline, Filebeat can tail & forward common logs to be indexed within Elasticsearch. The harvester is often compared to Logstash but it is not a suitable replacement; instead, the two should be used in tandem for most use cases.
Earlier versions of Filebeat suffered from a very limited scope & only allowed the user to send events to Logstash & Elasticsearch. More recent versions of the shipper have been updated to be compatible with Redis & Kafka.
A misconfigured Filebeat setup can lead to many complex logging concerns that this filebeat.yml wizard aims to solve. Just a couple of examples of these include excessively large registry files & file handles that error frequently when encountering deleted or renamed log files. Tracking numerous pipelines using this shipper can become tedious for self-hosted Elastic Stacks, so you may wish to consider our Hosted ELK service as a solution to this.
If you need any further assistance with migrating your log data to ELK we're here to help you get started. Feel free to get in contact with our support team by sending us a message via live chat & we'll be happy to assist.