
Send Azure Kubernetes logs to your Logstash instance provided by Logit.io

Azure Kubernetes Logs

Collect and ship Azure Kubernetes container logs to Logstash and Elasticsearch.

Filebeat is a lightweight shipper that helps you monitor Azure Kubernetes Service by collecting logs from the containers running on the host system. Configure Filebeat using the pre-defined examples below to collect and ship Azure Kubernetes logs to Logstash or Elasticsearch.

Step 1 - Download Filebeat Manifest File

Download the Filebeat Kubernetes deployment manifest (filebeat-kubernetes.yaml). To do this on your Azure cluster, run the following command in the Azure Cloud Shell terminal:

wget https://cdn.logit.io/filebeat-kubernetes.yaml

Step 2 - Insert Stack details

Now that you have the manifest, you need to add your Stack details. Open the file in a text editor.

Locate the environment variables that control the logging destination and enter your Stack's Logstash input information.

The environment variables you need to change are around line 54.
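The section you are editing looks similar to the sketch below. The variable names and values here are illustrative placeholders, not your actual settings; the exact names and line numbers can differ between manifest versions, so match them against the file you downloaded and use the Logstash input endpoint shown on your Stack's settings page.

```yaml
# Illustrative excerpt from the Filebeat DaemonSet spec in
# filebeat-kubernetes.yaml (names/values are placeholders):
env:
  - name: LOGSTASH_HOST
    value: "your-stack-id.logit.io"  # replace with your Stack's Logstash endpoint
  - name: LOGSTASH_PORT
    value: "5044"                    # replace with your Stack's Logstash port
```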


Step 3 - Deploy Filebeat pod

Now that your Filebeat deployment manifest is updated with your Stack details, you can deploy it using the following command in Azure Cloud Shell:

kubectl apply -f filebeat-kubernetes.yaml

Step 4 - Confirm deployment successful

Confirm that your pod has successfully been deployed using one or all of the following commands in Azure Cloud Shell:

kubectl get po -A

kubectl --namespace=kube-system get ds/filebeat

kubectl --namespace=kube-system get pods

You should see a pod for each Kubernetes node with a name similar to filebeat-abcde listed.
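If the pods are listed but no logs reach your Stack, inspecting the DaemonSet rollout and a pod's own log output is a quick next diagnostic. The pod name below is a placeholder; substitute one of the filebeat-* pod names returned by the commands above.

```shell
# Wait for the DaemonSet to finish rolling out across all nodes
kubectl --namespace=kube-system rollout status ds/filebeat

# Tail the logs of one Filebeat pod to check for connection errors
# (replace filebeat-abcde with a real pod name from the list above)
kubectl --namespace=kube-system logs filebeat-abcde --tail=50
```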
