
Send data from Google Kubernetes Engine to your Logstash instance provided by Logit.io


Ship Google Kubernetes Engine Logs to Logstash

Filebeat is an open source shipping agent that lets you ship Google Kubernetes Engine (GKE) Logs to one or more destinations, including Logstash.

Step 1 - Configure GKE

Open Google Kubernetes Engine in Google Cloud, choose to create a cluster, give it a suitable name, and choose a region for your cluster.

Create a cluster
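If you prefer the command line to the console, the same cluster can be created with the gcloud CLI. A sketch, in which the cluster name, region and node count are placeholder assumptions:

```shell
# Create a GKE cluster from the CLI.
# "my-gke-cluster", the region and the node count are placeholders.
gcloud container clusters create my-gke-cluster \
  --region europe-west2 \
  --num-nodes 1
```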

Step 2 - Connecting to the cluster

Once the cluster is created you will see a screen similar to the one below. Go ahead and hit Connect.

Connect to cluster

You'll have two options; we're going to choose Run in Google Cloud Shell.

Run in Cloud Shell

This will open a terminal window in your browser (it can take a few seconds to load).

Wait for terminal to load
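If you'd rather connect from your own machine instead of Cloud Shell, the Connect dialog offers a gcloud command along these lines (cluster name and region are placeholders):

```shell
# Fetch cluster credentials so that kubectl targets your new cluster.
gcloud container clusters get-credentials my-gke-cluster --region europe-west2
```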

Step 3 - Setting up the terminal

You'll need to run the initial command that comes pre-populated in the terminal to authenticate. Wait a few seconds for this to process, then download the Logit.io Filebeat Kubernetes deployment manifest.

To download the Logit.io deployment manifest for GKE, in the GKE terminal run:

wget cdn.logit.io/filebeat-kubernetes.yaml

Step 4 - Deploy Filebeat

Now that you have the manifest, you need to add your Stack's Logstash endpoint details.

Open the filebeat-kubernetes.yaml file in a text editor.

vi filebeat-kubernetes.yaml

or

nano filebeat-kubernetes.yaml

Update the following lines (line 59) in the yaml with your Stack Logstash endpoint and Beats-SSL port.

Note: This code snippet occurs twice in the yaml file and needs updating in both places.

env:
- name: LOGSTASH_HOST
  value: "guid-ls.logit.io"
- name: BEATS_PORT
  value: "00000"

After updating, the code should look like the example below, with your own values in place of the placeholders.

env:
- name: LOGSTASH_HOST
  value: "your-logstash-host"
- name: BEATS_PORT
  value: "your-ssl-port"

Exit and save the file.
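If you'd rather script the change than edit the file by hand, the substitution can be expressed as a small sed sketch. The host and port values below are placeholders for your own Stack details, and the `guid-ls.logit.io`/`00000` defaults are assumed to match the downloaded manifest:

```shell
# Placeholders -- substitute your Stack's Logstash endpoint and Beats-SSL port.
HOST="your-logstash-host.logit.io"
PORT="12345"

# Hypothetical helper: rewrite the placeholder host and port wherever they
# occur (the snippet appears twice in the manifest, so substitute globally).
patch_manifest() {
  sed -e "s|value: \"guid-ls.logit.io\"|value: \"${HOST}\"|g" \
      -e "s|value: \"00000\"|value: \"${PORT}\"|g"
}

# Usage:
#   patch_manifest < filebeat-kubernetes.yaml > patched.yaml
```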

Step 5 - Apply your updates

Now we're going to apply the file to the cluster.

kubectl apply -f filebeat-kubernetes.yaml
If you need to apply further changes after running the apply command, you may need to remove the deployed resources, make your changes, and then apply again.
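That update cycle can be sketched as follows, assuming the intent is to remove the deployed resources before re-applying the edited manifest:

```shell
# Remove the previously applied Filebeat resources from the cluster.
kubectl delete -f filebeat-kubernetes.yaml

# ...edit filebeat-kubernetes.yaml as needed, then re-apply it:
kubectl apply -f filebeat-kubernetes.yaml
```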

Step 6 - Confirm Deployment

Confirm that your pods have deployed by running the command below; you should see details of the Filebeat DaemonSet.

kubectl --namespace=kube-system get ds/filebeat
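If the DaemonSet is running, this prints a summary along these lines; the counts reflect the number of nodes in your cluster, and the values shown here are purely illustrative:

```
NAME       DESIRED   CURRENT   READY   UP-TO-DATE   AVAILABLE   NODE SELECTOR   AGE
filebeat   3         3         3       3            3           <none>          2m
```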

Browse to Kibana and you should see Logs arriving in your Stack.

From the GKE terminal you can view a Filebeat pod's logs to confirm that logs are being sent to your Logit.io Stack, using the following command.

kubectl logs <podname> --namespace=kube-system
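To find the pod name used above, you can list the Filebeat pods. The `k8s-app=filebeat` label is the one used by the standard Filebeat Kubernetes manifest, so treat it as an assumption if your manifest differs:

```shell
# List Filebeat pods in kube-system to find a pod name for `kubectl logs`.
kubectl get pods --namespace=kube-system -l k8s-app=filebeat
```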

Step 7 - GKE Logging Overview

Google Kubernetes Engine (often shortened to GKE) is a managed service for running Kubernetes that bypasses the need to install and operate the clusters typically required to run the popular container management tool.

It is often used because containers are well suited to the performance and scalability requirements of new enterprise-level applications, which must be continually tested and maintained.

Google officially open-sourced Kubernetes in 2014, and it has since become one of the most popular container management tools. By offering it as a managed service, Google provides an easy route for users who don't wish to maintain K8s themselves but still want the benefits, including lower cloud computing costs.

As part of this service, GKE also provides users with a sandbox that adds an additional layer of workload security. Private clusters can also be restricted to a private endpoint, which benefits organisations that require enhanced security.

When using Google Kubernetes Engine, it is important to make sure that cloud logging is not disabled. With this functionality disabled, troubleshooting incidents becomes extremely difficult, and if a pod is removed its logs may be nearly impossible to recover for effective root cause analysis.

GKE collects logs for all of the following areas: cluster audit data, worker nodes, application files and system logs. Just as important as ensuring GKE cloud logging is enabled is making allowances for long-term storage of your Google Kubernetes Engine logs within a centralised logging platform.

Logit.io provides centralised log management that brings together all logs from managed services, K8s, applications, servers and programming languages in a single affordable platform, giving your engineers a unified view of the health of your entire operating environment.

If you require any further help with shipping your Google Kubernetes Engine log files using Logstash, we're here to help. Feel free to get in touch with our support team via live chat and we'll be happy to assist.
