Google Kubernetes Engine Logs
Ship Google Kubernetes Engine Logs to Logstash
Filebeat is an open source shipping agent that lets you ship Google Kubernetes Engine (GKE) Logs to one or more destinations, including Logstash.
Follow this step-by-step guide to get logs from your system to Logit.io:
Step 1 - Configure GKE
Open Google Kubernetes Engine in Google Cloud, choose to create a cluster, give it a suitable name, and choose a region for your cluster.
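If you prefer the gcloud CLI to the console, a cluster can also be created with a command along these lines; the cluster name and region below are examples rather than values from this guide:
# Create a regional GKE cluster; replace the name and region with your own choices.
gcloud container clusters create example-gke-cluster --region europe-west2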
Step 2 - Connecting to the cluster
Once the cluster has been created you will see a screen similar to the one below. Go ahead and hit Connect.
You'll have two options, we're going to choose to Run in Google Cloud Shell.
This will open a terminal window in your browser. (It can take a few seconds to load)
Step 3 - Setting up the terminal
You'll need to run the command that comes pre-populated in the terminal to authenticate with your cluster. Wait a few seconds for this to complete before moving on to downloading the Logit.io Filebeat Kubernetes deployment manifest.
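The pre-populated command is normally a gcloud get-credentials call along the lines of the sketch below; your cluster name, region and project ID will differ:
# Fetch cluster credentials so kubectl can talk to your cluster (placeholder values shown).
gcloud container clusters get-credentials example-gke-cluster --region europe-west2 --project example-project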
To download the Logit.io deployment manifest for GKE, in the GKE terminal run:
wget cdn.logit.io/filebeat-kubernetes.yaml
Step 4 - Deploy Filebeat
Now that you have the manifest, you need to add your Stack's Logstash endpoint details.
Open the filebeat-kubernetes.yaml file in a text editor.
vi filebeat-kubernetes.yaml
or
nano filebeat-kubernetes.yaml
Update the following lines (line 59) in the yaml with your Stack Logstash endpoint and Beats-SSL port.
env:
- name: LOGSTASH_HOST
  value: "guid-ls.logit.io"
- name: BEATS_PORT
  value: "00000"
After updating, the code should look like the below, with your Stack's Logstash host and Beats-SSL port in place of the placeholders.
env:
- name: LOGSTASH_HOST
  value: "your-logstash-host"
- name: BEATS_PORT
  value: "your-ssl-port"
Exit and save the file.
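For context, the two environment variables above are consumed by the Logstash output section of the Filebeat configuration embedded in the manifest. A typical output section looks something like the sketch below; the exact section in the Logit.io manifest may differ slightly.
# Filebeat Logstash output (sketch) - reads the environment variables set above.
output.logstash:
  hosts: ['${LOGSTASH_HOST}:${BEATS_PORT}']
  # The Beats-SSL port expects TLS, so SSL is enabled.
  ssl.enabled: true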
Step 5 - Apply your updates
Now we're going to apply the file to the cluster.
kubectl apply -f filebeat-kubernetes.yaml
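If you'd like to wait for the rollout to finish before checking, kubectl can watch the DaemonSet for you (this assumes the manifest creates a DaemonSet named filebeat in the kube-system namespace, as used in the following steps):
# Block until every Filebeat pod in the DaemonSet has been rolled out.
kubectl --namespace=kube-system rollout status ds/filebeat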
Step 6 - Confirm Deployment
Confirm that the Filebeat pods have deployed; you should see output similar to that below.
kubectl --namespace=kube-system get ds/filebeat
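The exact figures depend on how many nodes are in your cluster; illustrative output from the command above looks something like this:
NAME       DESIRED   CURRENT   READY   UP-TO-DATE   AVAILABLE   NODE SELECTOR   AGE
filebeat   3         3         3       3            3           <none>          2m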
Browse to Kibana and you should see logs arriving in your Stack.
In the GKE console you can also view the Filebeat pod logs to confirm that logs are being sent to your Logit.io Stack, using the following command.
kubectl logs <podname> --namespace=kube-system
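To find the pod name, you can list the Filebeat pods first. The label selector below assumes the manifest labels its pods with k8s-app=filebeat, as the upstream Elastic Filebeat manifest does; adjust it if your manifest uses a different label.
# List the Filebeat pods so you can copy a pod name for the kubectl logs command above.
kubectl --namespace=kube-system get pods -l k8s-app=filebeat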
Step 7 - Check Logit.io for your logs
Now you should be able to view your data in your Stack.
If you don't see logs, take a look at How to diagnose no data in Stack below for help with common issues.
Step 8 - How to diagnose no data in Stack
If you don't see data appearing in your Stack after following the steps above, visit the Help Centre guide for steps to diagnose no data appearing in your Stack, or chat to support now.
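A quick check from the cluster side can also help narrow things down. The commands below assume the filebeat DaemonSet and pods created earlier in this guide; replace <podname> with a pod name returned by kubectl get pods:
# Describe the DaemonSet to check for scheduling or image-pull problems.
kubectl --namespace=kube-system describe ds filebeat
# Tail recent logs from a Filebeat pod and look for connection or certificate errors.
kubectl --namespace=kube-system logs <podname> --tail=50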
Step 9 - GKE Logging Overview
Google Kubernetes Engine (often shortened to GKE) is a managed service for running Kubernetes that removes the need to install and operate the clusters typically required to run the popular container management tool.
It is often used for managing containers because containers are well suited to meeting the performance and scalability requirements that new enterprise-level applications demand as they are created, tested and maintained.
Google officially open-sourced Kubernetes in 2014 and it has since become one of the most popular container management tools. By offering it as a managed service, Google provides an easy route for users who don't wish to maintain K8s themselves but still want the benefits that help lower their cloud computing costs.
As part of this service, GKE also provides users with a sandbox that gives an additional layer of workload security. Private clusters can also be restricted to a private endpoint, which is a benefit for organisations that require enhanced security.
When using Google Kubernetes Engine it is important to make sure that cloud logging is not disabled. When this functionality is disabled it can make troubleshooting incidents extremely difficult, and if a pod is removed its logs may become nearly impossible to recover for effective root cause analysis.
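If you want to confirm this from the command line, recent versions of the gcloud CLI expose the cluster's logging configuration; the cluster name and region below are placeholders, and the exact flags available depend on your gcloud version.
# Show which logging components are currently enabled for the cluster.
gcloud container clusters describe example-gke-cluster --region europe-west2 --format="value(loggingConfig.componentConfig.enableComponents)"
# Re-enable system and workload logging if it has been switched off.
gcloud container clusters update example-gke-cluster --region europe-west2 --logging=SYSTEM,WORKLOAD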
GKE collects logs for all of the following areas: cluster audit data, worker nodes, application files and system logs. As well as making sure GKE cloud logging is enabled, it is also vital that your organisation makes allowances for long-term storage of your Google Kubernetes Engine logs within a centralised logging platform.
Logit.io provides centralised log management that brings together all logs from managed services, K8s, applications, servers and programming languages in a single affordable platform to give your engineers a unified view across the entire health of your operating environment.
If you require any further help with shipping your Google Kubernetes Engine log files using Logstash, we're here to help. Feel free to get in contact with our support team via live chat & we'll be happy to assist.