Filebeat is an open source shipping agent that lets you ship logs from local files to one or more destinations, including Logstash.
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-amd64.deb
sudo dpkg -i filebeat-oss-7.8.1-amd64.deb
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-x86_64.rpm
sudo rpm -vi filebeat-oss-7.8.1-x86_64.rpm
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-darwin-x86_64.tar.gz
tar xzvf filebeat-oss-7.8.1-darwin-x86_64.tar.gz
- Download the Filebeat Windows zip file from the official downloads page.
- Extract the contents of the zip file into C:\Program Files.
- Rename the extracted directory to Filebeat.
- Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator). If you are running Windows XP, you may need to download and install PowerShell.
- Run the following commands to install Filebeat as a Windows service:
cd 'C:\Program Files\Filebeat'
.\install-service-filebeat.ps1

If script execution is disabled on your system, set the execution policy for the current session so the script can run:

PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1
Set up the data you wish to send us by editing the input path variables.
These paths fully support wildcards. You can also add a document type.
An example with nginx logs might look like
filebeat.inputs:
- type: log
  # change to true to enable this input configuration
  enabled: true
  paths:
    - /var/log/nginx/*.log
  fields:
    type: nginx-access
  fields_under_root: true
  encoding: utf-8
  exclude_files: [".gz"]
  ignore_older: 3h
Filebeat also has modules that can be listed, enabled or disabled using the following commands:
sudo filebeat modules list
sudo filebeat modules enable <module name>
sudo filebeat modules disable <module name>
cd <EXTRACTED_ARCHIVE>
./filebeat modules list
./filebeat modules enable <module name>
./filebeat modules disable <module name>
cd <EXTRACTED_ARCHIVE>
filebeat.exe modules list
filebeat.exe modules enable <module name>
filebeat.exe modules disable <module name>
Additionally, each module can be configured using its per-module config file located in the modules.d folder; most commonly this is used to read logs from a non-default location.
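As a sketch, a modules.d/nginx.yml that overrides the default log locations might look like the following (the /srv/www/logs paths are placeholders for your own non-default location):

```yaml
# modules.d/nginx.yml - example only; adjust paths to your environment
- module: nginx
  access:
    enabled: true
    var.paths: ["/srv/www/logs/access.log*"]
  error:
    enabled: true
    var.paths: ["/srv/www/logs/error.log*"]
```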
We'll be shipping to Logstash so that we have the option to run filters before the data is indexed.
Comment out the elasticsearch output block.
## Comment out elasticsearch output
#output.elasticsearch:
#  hosts: ["localhost:9200"]
Uncomment and change the logstash output to match below.
output.logstash:
  hosts: ["your-logstash-host:your-ssl-port"]
  loadbalance: true
  ssl.enabled: true
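Note that the loadbalance option only has an effect when more than one host is listed. A sketch with two Logstash hosts might look like this (the hostnames and port are placeholders):

```yaml
output.logstash:
  # with loadbalance enabled, events are distributed across all listed hosts
  hosts: ["logstash1.example.com:5044", "logstash2.example.com:5044"]
  loadbalance: true
  ssl.enabled: true
```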
Let's check that the configuration file is syntactically correct by running Filebeat directly in the terminal.
If the file is invalid, Filebeat will print an "error loading config file" message with details on how to correct the problem.
sudo filebeat -e -c /etc/filebeat/filebeat.yml
cd <EXTRACTED_ARCHIVE>
./filebeat -e -c filebeat.yml
cd <EXTRACTED_ARCHIVE>
.\filebeat.exe -e -c filebeat.yml
Ok, time to start ingesting data!
sudo systemctl enable filebeat
sudo systemctl start filebeat
Filebeat is well known as the most popular lightweight log shipper for sending logs to the Elastic Stack, thanks to its reliability and minimal memory footprint. It is the leading Beat in the collection of open-source shipping tools that also includes Auditbeat, Metricbeat and Heartbeat.
Filebeat forms the basis of the majority of ELK Stack based infrastructure. It originated as a combination of key features from Logstash-Forwarder and Lumberjack, and is written in Go. Within the logging pipeline, Filebeat tails, parses and forwards common logs to be indexed within Elasticsearch. It is often compared to Logstash, but it is not a suitable replacement; for most use cases the two should instead be used in tandem.
Earlier versions of Filebeat had a very limited scope and could only send events to Logstash and Elasticsearch. More recent versions of the shipper also support destinations such as Redis and Kafka.
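As a sketch, a Kafka output configuration might look like the following (the broker hostname and topic name are placeholders, and only one output may be enabled in filebeat.yml at a time):

```yaml
output.kafka:
  # placeholder broker address - replace with your Kafka cluster
  hosts: ["kafka1.example.com:9092"]
  topic: "filebeat-logs"
  compression: gzip
```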
A misconfigured Filebeat setup can lead to many complex logging problems that this filebeat.yml wizard aims to solve. Examples include excessively large registry files and file handlers that error frequently when encountering deleted or renamed log files. Tracking numerous pipelines with this shipper can become tedious for self-hosted Elastic Stacks, so you may wish to consider our Hosted ELK service as a solution.
If you need any further assistance migrating your Filebeat log data to the Elastic Stack, we're here to help you get started. Feel free to contact our support team via live chat and we'll be happy to assist.