Send data via Logstash Logging to your Logstash instance provided by Logit.io

Logstash Logging Configuration

Ship logs to your hosted Logstash instance at Logit.io

Filebeat is an open source shipping agent that lets you ship logs from local files to one or more destinations, including Logstash.

Step 1 - Install Filebeat

deb (Debian/Ubuntu/Mint)

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.15.1-amd64.deb
sudo dpkg -i filebeat-oss-7.15.1-amd64.deb

rpm (CentOS/RHEL/Fedora)

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.15.1-x86_64.rpm
sudo rpm -vi filebeat-oss-7.15.1-x86_64.rpm

macOS

curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.15.1-darwin-x86_64.tar.gz
tar xzvf filebeat-oss-7.15.1-darwin-x86_64.tar.gz

Windows

  • Download and extract the Windows zip file.
  • Rename the filebeat-<version>-windows directory to filebeat.
  • Open a PowerShell prompt as an Administrator.
  • Run the following to install as a Windows service:
.\install-service-filebeat.ps1
If script execution is disabled on your system, you need to set the execution policy for the current session to allow the script to run. For example: PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1.
My OS isn't here! Chat to support now
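
Once Filebeat is installed, a quick way to confirm the install worked is to check the reported version. This is a minimal sanity check, assuming a deb/rpm install; on macOS or Windows run the equivalent ./filebeat version or .\filebeat.exe version from the extracted directory.

# confirm the installed Filebeat version
filebeat version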

Step 2 - Enable the Logstash Module

Filebeat ships with several built-in modules you can use. For this integration you will need to enable the Logstash module.

deb/rpm

sudo filebeat modules list
sudo filebeat modules enable logstash

macOS

cd <EXTRACTED_ARCHIVE>
./filebeat modules list
./filebeat modules enable logstash

Windows

cd <EXTRACTED_ARCHIVE>
.\filebeat.exe modules list
.\filebeat.exe modules enable logstash

Additional module configuration can be done using the per-module config files located in the modules.d folder; most commonly this is used to read logs from a non-default location.

deb/rpm /etc/filebeat/modules.d/
mac/win <EXTRACTED_ARCHIVE>/modules.d/

- module: logstash
  # logs
  log:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/custom/path/to/logs"]

  # Slow logs
  slowlog:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/custom/path/to/logs"]

Step 3 - Copy Configuration File

The configuration file below is pre-configured to send data to your Logit.io Stack via Logstash.

Copy the configuration file below and overwrite the contents of filebeat.yml.

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

# ================================== Outputs ===================================
# ------------------------------ Logstash Output -------------------------------
No input available! Your Stack is missing the required Logstash input for this data source. Talk to support to add the input.
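
Once the Logstash input has been added to your Stack, this section of filebeat.yml points Filebeat at your Stack's Logstash endpoint. The snippet below is only a sketch of what the generated output block typically looks like; the host and port are placeholders and must be replaced with the endpoint and SSL port shown in your Logit.io Stack settings.

output.logstash:
  # Replace with the Logstash endpoint and SSL port from your Stack settings
  hosts: ["your-logstash-host:your-ssl-port"]
  loadbalance: true
  ssl.enabled: true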

Step 4 - Start Filebeat

Ok, time to start ingesting data!

deb/rpm

sudo systemctl enable filebeat
sudo systemctl start filebeat

macOS

./filebeat

Windows

PS C:\Program Files\Filebeat> Start-Service filebeat
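
To confirm Filebeat started cleanly, you can check the service status and follow its recent log output. The commands below assume a deb/rpm install managed by systemd; on Windows, Get-Service filebeat reports the service state.

# check the Filebeat service status (deb/rpm, systemd)
sudo systemctl status filebeat
# follow Filebeat's recent log output
sudo journalctl -u filebeat -f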

Step 5 - How to Diagnose No Data in Your Stack

If you don't see data appearing in your Stack after following the steps above, visit the Help Centre guide for steps to diagnose no data appearing in your Stack, or chat to support now.
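
Filebeat can also validate its own setup. The built-in test subcommands below check the configuration syntax and the connection to the configured output; the paths shown assume a deb/rpm install, so adjust them if you are running from an extracted archive.

# validate filebeat.yml syntax
sudo filebeat test config -c /etc/filebeat/filebeat.yml
# test the connection to the configured Logstash output
sudo filebeat test output -c /etc/filebeat/filebeat.yml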

Step 6 - Logstash Overview

What Is Logstash?

Logstash is an open-source, lightweight data processing pipeline created by Elastic. It is the most popular data pipeline used with Elasticsearch, as their close integration allows for powerful log processing capabilities.

What is Logstash Used For?

Logstash makes up an essential part of the ELK Stack as it provides the easiest solution for collecting, parsing, and storing logs for analysis & monitoring.

Logstash’s wide range of over 200 open-source plugins helps you easily index your data across multiple log types, including error, event & web server files from hundreds of popular integrations. The tool offers an array of input, output & filter plugins for enriching and transforming data from a variety of sources, including Golang, Google Cloud & Azure.

What Are The Advantages Of Using Logstash?

Some of the advantages of using this pipeline include high availability & flexibility, due to the wide adoption of ELK & the large community supporting their regularly maintained plugins. Some of the most popular plugins include Logtrailing (for tailing live events), Beats (including Metricbeat & Heartbeat), & TCP.

Logstash also benefits from having a straightforward configuration format that reduces the complexity of getting started with ELK. Data ingested using Logstash can be enriched prior to being indexed by Elasticsearch, making it readily available for filtering, analysis & reporting in Kibana once it is migrated to your ELK Stack.
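
To illustrate that configuration format, a minimal Logstash pipeline is sketched below; the port, grok pattern and Elasticsearch host are generic examples rather than values from your Logit.io Stack.

input {
  beats {
    port => 5044      # listen for events from Beats shippers such as Filebeat
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }   # parse Apache-style access logs
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weblogs-%{+YYYY.MM.dd}"   # daily index naming
  }
}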
