
Google Kubernetes Engine Logs

Ship Google Kubernetes Engine Logs to Logstash

Filebeat is an open source shipping agent that lets you ship Google Kubernetes Engine (GKE) Logs to one or more destinations, including Logstash.


Follow this step-by-step guide to ship logs from your system to Logit.io:

Step 1 - Configure GKE

Open Google Kubernetes Engine in Google Cloud, choose to create a cluster, give it a suitable name, and choose a region for your cluster.
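If you prefer the command line, the same cluster can be created with the gcloud CLI. A minimal sketch; the cluster name and region below are placeholder values, not ones from this guide:

```shell
# Sketch: create a GKE cluster from the gcloud CLI.
# "my-gke-cluster" and "europe-west2" are placeholder values --
# substitute your own cluster name and region.
gcloud container clusters create my-gke-cluster \
  --region europe-west2 \
  --num-nodes 1
```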

Create a cluster

Step 2 - Connecting to the cluster

Once the cluster is created you will see a screen similar to the one below. Go ahead and hit Connect.

Connect to cluster

You'll have two options; we're going to choose Run in Google Cloud Shell.

Run in Cloud Shell

This will open a terminal window in your browser (it can take a few seconds to load).

Wait for terminal to load

Step 3 - Setting up the terminal

Run the initial command that comes pre-populated in the terminal to authenticate. Wait a few seconds for it to process, then download the Logit.io Filebeat Kubernetes deployment manifest.

To download the Logit.io deployment manifest for GKE, in the GKE terminal run:

wget cdn.logit.io/filebeat-kubernetes.yaml
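Before editing, you can confirm the manifest downloaded correctly and locate the lines you'll change in the next step. A sketch, run from the same Cloud Shell directory:

```shell
# Confirm the manifest is present
ls -lh filebeat-kubernetes.yaml

# Print the line numbers of both occurrences of the endpoint
# settings that need updating in the next step
grep -n "LOGSTASH_HOST\|BEATS_PORT" filebeat-kubernetes.yaml
```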

Step 4 - Deploy Filebeat

Now that you have the manifest, you need to add your Stack's Logstash endpoint details.

Open the filebeat-kubernetes.yaml file in a text editor.

vi filebeat-kubernetes.yaml

or

nano filebeat-kubernetes.yaml

Update the following lines (around line 59) in the YAML with your Stack's Logstash endpoint and Beats-SSL port.

Note: This code snippet occurs twice in the YAML file and needs updating in both places.

env:
- name: LOGSTASH_HOST
  value: "guid-ls.logit.io"
- name: BEATS_PORT
  value: "00000"

After updating, the code should look like the below.

env:
- name: LOGSTASH_HOST
  value: "your-logstash-host"
- name: BEATS_PORT
  value: "your-ssl-port"

Exit and save the file.
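If you'd rather script the edit than use an editor, sed can replace both occurrences in one pass. A minimal sketch, assuming the manifest still contains the default placeholder values shown above; `example-ls.logit.io` and `5044` are hypothetical stand-ins for your Stack's endpoint and port:

```shell
# Hypothetical endpoint values -- replace with your Stack's own
LOGSTASH_HOST="example-ls.logit.io"
BEATS_PORT="5044"

# -i edits the file in place; the /g flag replaces every occurrence,
# covering both copies of the snippet in the manifest
sed -i \
  -e "s/guid-ls\.logit\.io/${LOGSTASH_HOST}/g" \
  -e "s/\"00000\"/\"${BEATS_PORT}\"/g" \
  filebeat-kubernetes.yaml
```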

Step 5 - Apply your updates

Now we're going to apply the file to the cluster.

kubectl apply -f filebeat-kubernetes.yaml
If you need to apply further updates after running the apply command, you may need to remove the deployed resources, make your changes to the manifest, and then apply it again.
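That redeploy cycle might look like the following sketch:

```shell
# Remove the existing Filebeat resources defined in the manifest
kubectl delete -f filebeat-kubernetes.yaml

# ...edit filebeat-kubernetes.yaml as needed, then re-apply
kubectl apply -f filebeat-kubernetes.yaml
```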

Step 6 - Confirm Deployment

Confirm your pod has deployed; you should see output similar to the below.

kubectl --namespace=kube-system get ds/filebeat

Browse to Kibana and you should see Logs arriving in your Stack.

In the GKE console you can view the Filebeat pod's log output to confirm that logs are being sent to your Logit.io Stack, using the following command.

kubectl logs <podname> --namespace=kube-system
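If you don't know the pod name, you can look it up first. A sketch, assuming the manifest uses the `k8s-app: filebeat` label found in Elastic's standard Filebeat Kubernetes manifest:

```shell
# List the Filebeat pods (the label selector is an assumption based
# on Elastic's stock Filebeat Kubernetes manifest)
kubectl --namespace=kube-system get pods -l k8s-app=filebeat

# Follow one pod's log output to watch events being shipped
kubectl --namespace=kube-system logs -f <podname>
```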

Step 7 - Check Logit.io for your logs

Data should now have been sent to your Stack.


If you don't see logs, take a look at How to diagnose no data in Stack below for help with common issues.

Step 8 - How to diagnose no data in Stack

If you don't see data appearing in your Stack after following the steps, visit the Help Centre guide for steps to diagnose no data appearing in your Stack, or chat to support now.

Step 9 - GKE Logging Overview

Sending data to Logit.io from Google Kubernetes Engine (GKE) is a streamlined process that empowers organizations to gain valuable insights into their containerized applications. The integration between GKE and Logit.io simplifies the collection and analysis of logs, ensuring centralized visibility and real-time monitoring for your container orchestration environment. Logit.io's GCP logging service includes further integrations that work in tandem with GKE.


© 2024 Logit.io Ltd, All rights reserved.