
Local Logstash Configuration

Ship logs to your hosted Logstash instance at Logit.io

All Logit.io ELK Stacks include highly available hosted Logstash instances, removing the need to install and maintain your own Logstash server. Logit.io recommends using Filebeat to ship your logs and metrics to your hosted Logstash instance, so that you benefit from predefined filters that can be customised from your dashboard.


Follow this step-by-step guide to get logs from your system to Logit.io:

Step 1 - Before you begin

Logit.io recommends using Filebeat to ship logs and metrics to your hosted Logstash instance on Logit.io; you then benefit from high availability and predefined Logstash pipelines.

We understand that some customers may have a specific requirement to use Logstash locally to ship logs, so the steps below describe how to configure this integration.

Step 2 - Logstash to Logstash

One option for sending data from your local Logstash to your Logit.io ELK Stack is to route it via your hosted Logstash instance. To do this, configure the output on your local Logstash to use the TCP-SSL port of your hosted Logstash, as shown below. Note that the data you send must be valid JSON.

output {
  tcp {
    codec => json_lines              # serialise each event as one JSON line
    host => "your-logstash-host"     # your hosted Logstash endpoint
    port => your-ssl-port            # the TCP-SSL port of your hosted Logstash
    ssl_enable => true               # encrypt the connection with TLS
  }
}
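
The tcp output above forwards events exactly as the json_lines codec serialises them, so the simplest arrangement is an input that already produces structured events. As a minimal sketch, assuming your application writes one JSON object per line (the file path is hypothetical; substitute your own log location):

input {
  file {
    path => "/var/log/myapp/app.json"   # hypothetical path to a file of JSON lines
    codec => "json"                     # decode each line into a structured event
  }
}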

Step 3 - Sending directly to Elasticsearch

Another option is to send data from your local Logstash instance directly to Elasticsearch. To do this you will need your Stack in basic authentication mode: choose Stack Settings > Elasticsearch and switch the authentication mode to basic authentication. Once you have done this, edit the output on your local Logstash to look like the example below.

output {
  elasticsearch {
    hosts => ["<your-elasticsearch-endpoint-address>:443"]
    user => "<your-elasticsearch-username>"
    password => "<your-elasticsearch-password>"
    manage_template => false   # don't let Logstash overwrite index templates on the Stack
    index => "%{[@metadata][index]}-%{+YYYY.MM.dd}"   # index name built from metadata set in the filter below
  }
}
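
If you would rather not hard-code credentials in the pipeline file, Logstash supports ${VAR} substitution from environment variables or the Logstash keystore. A variant of the output above using that feature (ES_USER and ES_PWD are example variable names, not values provided by Logit.io):

output {
  elasticsearch {
    hosts => ["<your-elasticsearch-endpoint-address>:443"]
    user => "${ES_USER}"       # resolved from the environment or the Logstash keystore at startup
    password => "${ES_PWD}"
    manage_template => false
    index => "%{[@metadata][index]}-%{+YYYY.MM.dd}"
  }
}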

While not required, it may be worthwhile adding the following filter before the output. It adds metadata to your logs that gives the index name the format logstash-YYYY.MM.dd, or <beatname>-YYYY.MM.dd for events shipped by a Beat.

filter {
  if ![@metadata][beat] {
    # events that did not come from a Beat are indexed as logstash-*
    mutate { add_field => { "[@metadata][index]" => "logstash" } }
  } else {
    # events from a Beat keep the shipper's name, e.g. metricbeat-*
    mutate { add_field => { "[@metadata][index]" => "%{[@metadata][beat]}" } }
  }
}
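
With this filter in place, an event shipped by Metricbeat would be indexed as, for example, metricbeat-2024.06.01, while an event arriving without any Beat metadata falls back to logstash-2024.06.01.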

Step 4 - Check Logit.io for your logs

Data should now have been sent to your Stack.


If you don't see logs, take a look at How to diagnose no data in Stack below for help diagnosing common issues.

Step 5 - How to diagnose no data in Stack

If you don't see data appearing in your Stack after following the steps above, visit the Help Centre guide for steps to diagnose no data appearing in your Stack, or chat to support now.

Step 6 - Logstash dashboard

The Logstash module comes with predefined Kibana dashboards. To view your dashboards for any of your Logit.io stacks, launch Logs and choose Dashboards.

(Screenshot: predefined Kibana dashboard)

Step 7 - Logstash Overview

What Is Logstash?

Logstash is a lightweight open-source processing pipeline created by Elastic. It is the most popular data pipeline used with Elasticsearch, as their close integration allows for powerful log processing capabilities.

What is Logstash Used For?

Logstash makes up an essential part of the ELK Stack as it provides the easiest solution for collecting, parsing, and storing logs for analysis & monitoring.

Logstash’s wide range of over 200 open-source plugins helps you easily index your data across multiple log types, including error, event & web server files from hundreds of popular integrations. The tool offers an array of input, output, & filter plugins for enriching and transforming data from a variety of sources, including Golang, Google Cloud, & Azure.

What Are The Advantages Of Using Logstash?

Some of the advantages of using this pipeline include high availability & flexibility, due to the wide adoption of ELK & the large community supporting its regularly maintained plugins. Some of the most popular plugins include Logtrail (for tailing live events), Beats (including Metricbeat & Heartbeat), & TCP.

Logstash also benefits from having a straightforward configuration format that reduces the complexity of getting started with ELK. Data ingested using Logstash is enriched prior to being indexed by Elasticsearch, making it readily available for filtering, analysis & reporting in Kibana once it reaches your ELK Stack.
