
Logstash serves as a real-time event processing engine that is commonly deployed within the OpenSearch ecosystem alongside OpenSearch, Beats, and OpenSearch Dashboards. It acts as a centralized pipeline that collects data from numerous sources before forwarding it to OpenSearch.

By installing Logstash alongside OpenSearch, you can enhance your log management capabilities and benefit in several ways. One example is the scalability and flexibility that Logstash provides: it is designed to handle large volumes of data and can scale horizontally to accommodate growing workloads. By deploying Logstash alongside OpenSearch, you can distribute the data processing workload across multiple Logstash instances, ensuring optimal performance and scalability for your log management infrastructure.

In addition to this, Logstash allows you to preprocess and transform raw log data before indexing it into OpenSearch. This includes tasks such as parsing log messages, extracting relevant fields, enriching data with additional context (e.g., geolocation information), and filtering out irrelevant or sensitive information. By utilizing Logstash's powerful transformation capabilities, you guarantee that your data is structured and normalized for efficient querying and analysis in OpenSearch.
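As an illustrative sketch of these transformations (the field names below are hypothetical), a filter block that parses, enriches, and redacts events might look like this:

```
filter {
  # Parse an unstructured message into distinct, named fields
  grok {
    match => { "message" => "%{IPORHOST:client_ip} %{WORD:http_method} %{URIPATH:request_path}" }
  }
  # Enrich the event with geolocation context derived from the client IP
  geoip {
    source => "client_ip"
  }
  # Strip fields that are irrelevant or sensitive before indexing
  mutate {
    remove_field => [ "password", "session_token" ]
  }
}
```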

Also, by deploying Logstash alongside OpenSearch you can gain from its vast integration flexibility. Logstash supports a broad variety of input and output plugins, which enables you to seamlessly integrate with numerous data sources and destinations. Whether you need to collect logs from traditional log files, streaming platforms like Kafka, or cloud services like AWS S3, Logstash offers the necessary plugins to facilitate seamless data integration with OpenSearch.
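For example (the paths, brokers, and topic names below are placeholders), a single input block can combine several of these plugins:

```
input {
  # Tail traditional log files on disk
  file {
    path => "/var/log/app/*.log"
    start_position => "beginning"
  }
  # Consume events from a Kafka topic
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["app-logs"]
  }
}
```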

In this article, we will highlight the benefits of deploying Logstash alongside OpenSearch, show how to install Logstash, and begin shipping Logstash events to OpenSearch.


OpenSearch Logstash Structure

Logstash operates by the user configuring a pipeline that consists of three phases: inputs, filters, and outputs. The input component accepts events, such as logs, from a variety of sources simultaneously. Logstash offers a range of input plugins covering TCP/UDP, file inputs, syslog, Microsoft Windows Event Logs, stdin, HTTP, and more. Additionally, you can employ Beats, an open-source collection of data shippers, to gather events. These events are then passed from the input plugin to the filters.

The filter component of Logstash is responsible for parsing and optimizing events using multiple methods. Logstash provides an extensive variety of filter plugins designed to alter events before passing them to an output destination. For instance, the grok filter is used to parse unstructured events into distinct fields, while the mutate filter is employed to modify fields. These filters are applied sequentially to process events effectively.

Lastly, the output phase of Logstash forwards the filtered events to one or multiple destinations. Logstash offers a diverse selection of output plugins tailored for numerous destinations, including OpenSearch, TCP/UDP, emails, files, stdout, HTTP, Nagios, and more.
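The three phases can be seen together in a minimal sketch of a pipeline, using only core plugins, that reads events from stdin, normalizes a field with the mutate filter, and prints the result to stdout:

```
input {
  stdin { }
}

filter {
  mutate {
    # Lowercase the message field so downstream queries are case-insensitive
    lowercase => [ "message" ]
  }
}

output {
  stdout { codec => rubydebug }
}
```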

Install Logstash on OpenSearch

To begin installing Logstash alongside OpenSearch, you will need to ensure that you have Java Runtime Environment (JRE) version 8 or later installed on your system, as Logstash requires Java to run.
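You can confirm that a suitable Java version is available from a terminal before proceeding:

```
# Print the installed Java version; Logstash requires JRE 8 or later
java -version
```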

  1. Download Logstash: Visit the official Logstash website to download the latest version of Logstash.

  2. Extract Logstash: Once the download is complete, extract the Logstash package to a suitable location on your system. For Linux and macOS, you can use a command like tar -zxvf logstash-<version>.tar.gz to extract the contents of the downloaded archive. For Windows, you can use built-in tools or third-party software to extract the ZIP file.

  3. Configure Logstash: Locate the config directory, which contains the configuration files for Logstash, and customize them based on your requirements. The main settings file is named logstash.yml, while pipeline configurations are stored in separate .conf files (in the conf.d directory on package-based installs).

  4. Start Logstash: Open a terminal or command prompt and navigate to the Logstash directory. Run the Logstash executable, usually named logstash or logstash.bat, depending on your operating system. Start Logstash with the -f flag pointing at your pipeline configuration file (the logstash.yml settings file is loaded automatically from the config directory). For example: bin/logstash -f config/pipeline.conf

  5. Verify Installation: Once Logstash is running, verify its status by checking the console output for any errors or warnings. You can also query the Logstash monitoring API (available on port 9600 by default) to check the status and performance of Logstash pipelines.

  6. Install Plugins: Depending on your requirements, you may need to install additional plugins for Logstash to support specific input, filter, or output functionalities. Use the Logstash plugin manager (bin/logstash-plugin) to install plugins from the official plugin repository or custom sources, for example: bin/logstash-plugin install logstash-output-opensearch

  7. Integration with OpenSearch: To send data from Logstash to OpenSearch, configure an output plugin in your Logstash pipeline configuration file. Use the OpenSearch output plugin (output { opensearch { ... } }) and specify the connection details for your OpenSearch cluster, including the cluster URL, index name, and authentication credentials if required.

Ship Logstash Events to OpenSearch

With this simple configuration, you can begin shipping Logstash events to an OpenSearch cluster and then visualize these events in OpenSearch Dashboards.

  1. Logstash Pipeline Configuration: Create a config/pipeline.conf file and add the following configuration.

input {
  stdin { codec => json }
}

output {
  opensearch {
    hosts => "https://localhost:9200"
    user => "admin"
    password => "admin"
    index => "logstash-logs-%{+YYYY.MM.dd}"
    ssl_certificate_verification => false
  }
}

This Logstash pipeline accepts JSON input through the terminal and sends the events to an OpenSearch cluster running locally. Logstash writes the events to an index with the logstash-logs-%{+YYYY.MM.dd} naming convention.
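The %{+YYYY.MM.dd} portion is Logstash date math, resolved from each event's @timestamp in UTC. As a rough illustration, the equivalent of today's index name can be produced in a shell:

```shell
# Mimic Logstash's %{+YYYY.MM.dd} index date math using the date command
date -u "+logstash-logs-%Y.%m.%d"
```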

  2. Launch Logstash: Use this command to start Logstash.

$ bin/logstash -f config/pipeline.conf --config.reload.automatic

  3. Add JSON Object: Add a JSON object in the terminal.

{ "amount": 10, "quantity": 2}

  4. Launch OpenSearch Dashboards: Start OpenSearch Dashboards, open the Dev Tools console, and run the following request to list your indices.

GET _cat/indices?v

health | status | index | uuid | pri | rep | docs.count | docs.deleted | store.size | pri.store.size
green | open | logstash-logs-2021.07.01 | iuh648LYSnmQrkGf70pplA | 1 | 1 | 1 | 0 | 10.3kb | 5.1kb
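To inspect the indexed document itself, you can also run a search from Dev Tools (the index name will reflect the date your event was ingested):

```
GET logstash-logs-*/_search
{
  "query": { "match_all": {} }
}
```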

If you've enjoyed this article why not read The Top 10 OpenSearch Plugins or Mastering Observability with OpenSearch next?


© 2024 Logit.io Ltd, All rights reserved.