
Kubernetes Logging

Ship Kubernetes container logs to your hosted Logstash instance.

Filebeat is a lightweight shipper that enables you to send your Kubernetes logs to Logstash and Elasticsearch. Configure Filebeat using the pre-defined examples below to start sending and analysing your Kubernetes logs.


Follow this step-by-step guide to start sending data from your system to your Stack.

Step 1 - Copy Manifest File

Copy and use the Kubernetes Filebeat manifest below.

If you aren't logged in, you may need to manually update the your-logstash-host and your-logstash-port placeholders in the manifest's environment variables.
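The placeholder substitution can be scripted. Below is a minimal sketch: it creates a small env snippet like the one found in a Filebeat manifest, then uses sed to swap in a real endpoint. The variable names (LOGSTASH_HOST, LOGSTASH_PORT) and the endpoint logstash.example.com:5044 are illustrative only; match whatever names and values your actual manifest uses.

```shell
# Illustrative only: a minimal env snippet resembling the one in the
# Filebeat manifest (variable names are assumptions, not the real manifest).
cat > filebeat-env-snippet.yaml <<'EOF'
- name: LOGSTASH_HOST
  value: "your-logstash-host"
- name: LOGSTASH_PORT
  value: "your-logstash-port"
EOF

# Substitute your own Logstash endpoint for the placeholders.
# logstash.example.com and 5044 are example values.
sed -i \
  -e 's/your-logstash-host/logstash.example.com/' \
  -e 's/your-logstash-port/5044/' \
  filebeat-env-snippet.yaml

grep LOGSTASH filebeat-env-snippet.yaml
```

When editing the real manifest, run the same sed commands against filebeat-kubernetes.yaml instead of the snippet file (note that sed -i requires a suffix argument on macOS/BSD).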

For use with Filebeat version 7.x.

Step 2 - Deploy Pod

Now that your deployment manifest is updated, you can deploy it using:

kubectl apply -f filebeat-kubernetes.yaml

Step 3 - Confirm Completed Deployment

kubectl --namespace=kube-system get ds/filebeat

kubectl --namespace=kube-system get pods

You should see a pod for each Kubernetes node with a name similar to filebeat-abcde listed. The pods should progress from Pending to Running within a couple of minutes as the containers are downloaded and started.
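If you want to script this check rather than eyeball the output, the status column can be parsed with standard tools. This sketch runs against a saved sample of `kubectl get pods` output (the pod names and ages below are made up for illustration); in practice you would pipe the live kubectl output into the same awk filter.

```shell
# Illustrative sample of `kubectl --namespace=kube-system get pods` output
# once the DaemonSet is healthy (pod names here are invented).
cat > pods.txt <<'EOF'
NAME             READY   STATUS    RESTARTS   AGE
filebeat-abcde   1/1     Running   0          2m
filebeat-fghij   1/1     Running   0          2m
EOF

# Count the Filebeat pods that have reached the Running phase.
running=$(awk '$1 ~ /^filebeat-/ && $3 == "Running"' pods.txt | wc -l)
echo "Running Filebeat pods: $running"
```

The count should equal the number of nodes in your cluster, since a DaemonSet schedules one Filebeat pod per node.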

Step 4 - Check for your logs

Data should now have been sent to your Stack.


If you don't see logs, take a look at How to Diagnose No Data in Stack below for how to diagnose common issues.

Step 5 - How to Diagnose No Data in Stack

If you don't see data appearing in your Stack after following the steps above, visit the Help Centre guide for steps to diagnose no data appearing in your Stack, or chat with support now.

Step 6 - Kubernetes Logging to OpenSearch Overview

Kubernetes was open-sourced in 2014 by Google and has quickly become one of the most popular container management tools on the market as it helps to significantly lower the cost of cloud computing & provides a resilient framework for deploying applications.

A common challenge for effective Kubernetes log aggregation is that during traffic spikes data can easily be lost and not accounted for without a scalable logging solution. Our platform also provides log tailing for real-time monitoring of your Kubernetes metrics, and our container monitoring platform is built to collect, parse, and transform application logs from your Kubernetes clusters within a few steps using the power of the managed Elastic Stack.

Monitor across 1,000s of containers, layers, log levels, and data types in one centralised logging platform & save hours on monthly maintenance to support the ELK Stack.

If you need any further help with migrating your Kubernetes log files using Filebeat we're here to help. Feel free to get in contact with our support team via live chat & we'll be happy to assist.


© 2024 Ltd, All rights reserved.