Send data via Filebeat to your Logstash instance.


A log shipper designed for files.

Filebeat is an open-source shipping agent that lets you ship logs from local files to one or more destinations, including Logstash.

Step 1 - Install Filebeat

deb (Debian/Ubuntu/Mint)

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.6.2-amd64.deb
sudo dpkg -i filebeat-oss-7.6.2-amd64.deb

rpm (CentOS/RHEL/Fedora)

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.6.2-x86_64.rpm
sudo rpm -vi filebeat-oss-7.6.2-x86_64.rpm


mac (macOS)

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.6.2-darwin-x86_64.tar.gz
tar xzvf filebeat-oss-7.6.2-darwin-x86_64.tar.gz


win (Windows)

  • Download the Filebeat Windows zip file from the official downloads page.
  • Extract the contents of the zip file into C:\Program Files.
  • Rename the filebeat-<version>-windows directory to Filebeat.
  • Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator). If you are running Windows XP, you may need to download and install PowerShell.
  • Run the following commands to install Filebeat as a Windows service:
PS > cd 'C:\Program Files\Filebeat'
PS C:\Program Files\Filebeat> .\install-service-filebeat.ps1
If script execution is disabled on your system, you need to set the execution policy for the current session to allow the script to run. For example: PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1.
Don't see your system? Check out the official downloads page for more options (including 32-bit versions).

Step 2 - Locate the configuration file

deb/rpm /etc/filebeat/filebeat.yml
mac/win <EXTRACTED_ARCHIVE>/filebeat.yml

Step 3 - Configure the inputs

Set up the data you wish to send by editing the input path variables.
These fully support wildcards. You can also add a document type.
An example with nginx logs might look like:


filebeat.inputs:
- type: log
  # Change to true to enable this input configuration
  enabled: true
  paths:
    - /var/log/nginx/*.log
  fields:
    type: nginx-access
  fields_under_root: true
  encoding: utf-8
  exclude_files: [".gz"]
  ignore_older: 3h
There's also a full example configuration file called filebeat.reference.yml that shows all the possible options.
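Before touching the live file, you can stage the input snippet in a scratch file and sanity-check the keys that most often get mis-indented. This is only a sketch: the /tmp path is an assumption, and in practice you would merge the block into the filebeat.yml location for your platform from Step 2.

```shell
# Sketch: stage the nginx input from above in a scratch file (/tmp path is
# illustrative), then grep for the keys before copying it into filebeat.yml.
cat > /tmp/filebeat-input.yml <<'EOF'
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/*.log
  fields:
    type: nginx-access
  fields_under_root: true
EOF

# Confirm the top-level key and the input made it in intact.
grep -q 'filebeat.inputs:' /tmp/filebeat-input.yml && echo "input snippet staged"
```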

Step 4 - Configure Modules (Optional)

Filebeat also has modules that can be listed, enabled, or disabled using:


deb/rpm

sudo filebeat modules list
sudo filebeat modules enable <module name>
sudo filebeat modules disable <module name>


mac

./filebeat modules list
./filebeat modules enable <module name>
./filebeat modules disable <module name>


win

filebeat.exe modules list
filebeat.exe modules enable <module name>
filebeat.exe modules disable <module name>

Additionally, module configuration can be done using the per-module config files located in the modules.d folder; most commonly this is used to read logs from a non-default location.

deb/rpm /etc/filebeat/modules.d/
mac/win <EXTRACTED_ARCHIVE>/modules.d/
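For example, to point the nginx module's access logs at a non-default directory, you would edit modules.d/nginx.yml and set var.paths. The sketch below stages such an override in a scratch file; the /srv/nginx/logs directory is purely illustrative.

```shell
# Sketch: an nginx module override reading access logs from a custom
# directory. var.paths is the per-module path override; /srv/nginx/logs
# and the /tmp staging path are assumed, illustrative locations.
cat > /tmp/nginx.yml <<'EOF'
- module: nginx
  access:
    enabled: true
    var.paths: ["/srv/nginx/logs/access.log*"]
EOF

grep -q 'var.paths' /tmp/nginx.yml && echo "module override staged"
```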

Step 5 - Configure output

We'll be shipping to Logstash so that we have the option to run filters before the data is indexed.
Comment out the elasticsearch output block.

## Comment out the elasticsearch output
#output.elasticsearch:
#  hosts: ["localhost:9200"]

Uncomment and change the logstash output to match below.

output.logstash:
    hosts: ["your-logstash-host:your-port"]
    loadbalance: true
    ssl.enabled: true
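Taken together, the output section of filebeat.yml should end up looking like the sketch below: elasticsearch commented out, logstash active. The host and port are placeholders you must substitute, and staging it in /tmp is just an illustrative way to check that only one output remains enabled.

```shell
# Sketch: the combined output section after Step 5. Host/port are
# placeholders; the /tmp path is illustrative.
cat > /tmp/filebeat-output.yml <<'EOF'
#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
    hosts: ["your-logstash-host:your-port"]
    loadbalance: true
    ssl.enabled: true
EOF

# Strip comments and confirm logstash is the only active output.
grep -v '^#' /tmp/filebeat-output.yml | grep -q 'output.logstash' && echo "logstash output active"
```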

Step 6 - Validate configuration

Let's check that the configuration file is syntactically correct.


deb/rpm

sudo filebeat -e -c /etc/filebeat/filebeat.yml


mac

./filebeat -e -c filebeat.yml


win

filebeat.exe -e -c filebeat.yml

Step 7 - Start Filebeat

Ok, time to start ingesting data!


deb/rpm

sudo systemctl enable filebeat
sudo systemctl start filebeat

mac

./filebeat

win

Start-Service filebeat

Step 8 - Filebeat Overview

Filebeat is well known as the most popular lightweight log shipper for sending logs to the Elastic Stack, thanks to its reliability and minimal memory footprint. It is the leading Beat in the collection of open-source shipping tools, which also includes Auditbeat, Metricbeat and Heartbeat.

Filebeat forms the basis of the majority of ELK Stack based infrastructure. It originated as a combination of key features from Logstash-Forwarder and Lumberjack, and is written in Go. Within the logging pipeline, Filebeat can tail, parse and forward common logs to be indexed in Elasticsearch. Its harvester is often compared to Logstash, but it is not a suitable replacement and should instead be used in tandem with Logstash for most use cases.

Earlier versions of Filebeat had a very limited scope and could only send events to Logstash and Elasticsearch. More recent versions of the shipper also support destinations such as Redis and Kafka.

A misconfigured Filebeat setup can lead to many complex logging concerns that this filebeat.yml wizard aims to solve. Just a couple of examples include excessively large registry files and file handles that error frequently when encountering deleted or renamed log files. Tracking numerous pipelines with this shipper can become tedious for self-hosted Elastic Stacks, so you may wish to consider our Hosted ELK service as a solution.

If you need any further assistance with migrating your Filebeat log data to the Elastic Stack, we're here to help you get started. Feel free to contact our support team via live chat and we'll be happy to assist.
