Logstash Logging Configuration
Ship logs to your hosted Logstash instance at Logit.io
Filebeat is an open source shipping agent that lets you ship logs from local files to one or more destinations, including Logstash.
Follow this step-by-step guide to ship logs from your system to Logit.io:
Step 2 - Enable the Logstash Module
There are several built-in Filebeat modules you can use. You will need to enable the Logstash module.
deb/rpm
sudo filebeat modules list
sudo filebeat modules enable logstash
macOS
cd <EXTRACTED_ARCHIVE>
./filebeat modules list
./filebeat modules enable logstash
Windows
cd <EXTRACTED_ARCHIVE>
.\filebeat.exe modules list
.\filebeat.exe modules enable logstash
Additional module configuration can be done using the per-module config files located in the modules.d folder. Most commonly this is used to read logs from a non-default location.
deb/rpm /etc/filebeat/modules.d/
mac/win <EXTRACTED_ARCHIVE>/modules.d/
- module: logstash
  # logs
  log:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/custom/path/to/logs"]
  # Slow logs
  slowlog:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/custom/path/to/logs"]
Step 3 - Update your configuration file
The configuration file below is pre-configured to send data to your Logit.io Stack via Logstash.
Copy the configuration file below and overwrite the contents of filebeat.yml.
# ============================== Filebeat modules ==============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
  #reload.period: 10s

# ================================== Outputs ===================================
# ------------------------------ Logstash Output -------------------------------
output.logstash:
  hosts: ["your-logstash-host:your-ssl-port"]
  loadbalance: true
  ssl.enabled: true

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
If you’re running Filebeat 7
add this code block to the end. Otherwise, you can leave it out.
# ... For Filebeat 7 only ...
filebeat.registry.path: /var/lib/filebeat
If you’re running Filebeat 6
add this code block to the end. Otherwise, you can leave it out.
# ... For Filebeat 6 only ...
registry_file: /var/lib/filebeat/registry
Validate your YAML
It’s a good idea to run the configuration file through a YAML validator to rule out indentation errors, clean up extra characters, and check if your YAML file is valid. Yamllint.com is a great choice.
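If you prefer to check the file locally instead, a quick syntax check along the lines of the one below also works. This is a minimal sketch that assumes Python 3 and the PyYAML package are available on the machine holding filebeat.yml; it only confirms the file parses as YAML and does not validate Filebeat-specific settings.
# Quick local YAML syntax check (assumes Python 3 + PyYAML are installed)
python3 -c "import yaml, sys; yaml.safe_load(open(sys.argv[1])); print('YAML OK')" filebeat.yml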
Step 4 - Validate configuration
If you have issues starting Filebeat in the next step, you can use the commands below to troubleshoot.
Let's check that the configuration file is syntactically correct by running Filebeat directly in the terminal.
If the file is invalid, Filebeat will print an 'error loading config file' error message with details on how to correct the problem.
deb/rpm
sudo filebeat -e -c /etc/filebeat/filebeat.yml
macOS
cd <EXTRACTED_ARCHIVE>
sudo ./filebeat -e -c filebeat.yml
Windows
cd <EXTRACTED_ARCHIVE>
.\filebeat.exe -e -c filebeat.yml
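Recent Filebeat versions also ship built-in test subcommands that can help here. The commands below are a sketch assuming the deb/rpm install paths used above; adjust the -c path if you are using the extracted archive on macOS or Windows.
# Check the configuration file for errors
sudo filebeat test config -c /etc/filebeat/filebeat.yml
# Confirm Filebeat can reach the Logstash output defined in filebeat.yml
sudo filebeat test output -c /etc/filebeat/filebeat.yml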
Step 5 - Start filebeat
Start or restart Filebeat to apply the configuration changes.
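The exact command depends on how Filebeat was installed. The examples below are a sketch assuming a systemd-based deb/rpm install and the extracted-archive layout used in the earlier steps.
deb/rpm
sudo systemctl restart filebeat
macOS
cd <EXTRACTED_ARCHIVE>
sudo ./filebeat -e    # runs in the foreground and logs to stderr
Windows
cd <EXTRACTED_ARCHIVE>
.\filebeat.exe -e    # runs in the foreground and logs to stderr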
Step 6 - Check Logit.io for your logs
Now you should be able to view your data in your Logit.io Stack.
If you don't see logs, take a look at How to diagnose no data in Stack below for help with common issues.
Step 7 - How to diagnose no data in Stack
If you don't see data appearing in your Stack after following the steps, visit the Help Centre guide for steps to diagnose no data appearing in your Stack or Chat to support now.
Step 8 - Logstash Overview
What Is Logstash?
Logstash is an open-source, lightweight processing pipeline created by Elastic. It is the most popular data pipeline used with Elasticsearch, as their close integration allows for powerful log processing capabilities.
What is Logstash Used For?
Logstash makes up an essential part of the ELK Stack as it provides the easiest solution for collecting, parsing, and storing logs for analysis & monitoring.
Logstash’s wide range of over 200 open-source plugins helps you easily index your data across multiple log types, including error, event & web server files from hundreds of popular integrations. The tool offers an array of input, output & filter plugins for enriching and transforming data from a variety of sources, including Golang, Google Cloud & Azure.
What Are The Advantages Of Using Logstash?
Some of the advantages of using this pipeline include high availability & flexibility, due to the wide adoption of ELK & the large community supporting their regularly maintained plugins. Some of the most popular plugins include Logtrailing (for tailing live events), Beats (including Metricbeat & Heartbeat), & TCP.
Logstash also benefits from a straightforward configuration format that reduces the complexity of getting started with ELK. Data ingested using Logstash is enriched prior to being indexed by Elasticsearch, making it readily available for filtering, analysis & reporting in Kibana once data is migrated to your ELK Stack.