
JSON

How to upload JSON logs or JSON log files to Logstash

Follow this step-by-step guide to get logs from your system to Logit.io:

Step 1 - Send a single JSON log using curl

The first example takes a string in JSON format and passes it to Logstash via our API.

curl -i -H "ApiKey: your-api-key" -H "Content-Type: application/json" -H "LogType: default" https://api.logit.io/v2 -d '{"test":"This is a test", "Country":"United Kingdom", "Weather":"Sunny"}'

Your ApiKey can be found by doing the following:

From the Stack dashboard page, click the Settings button, then choose Stack API Keys from the left-hand menu.

Executing the command above, with ApiKey set to the key of the Stack where the data is to be logged, sends the string through to Logstash. The new record can then be viewed by opening Kibana.

If the receiving Stack has been set up to parse JSON, the data will be logged as three new fields. If the Stack has not been set up to parse JSON, the message field will contain the string exactly as it was sent.
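
For illustration, a sketch of what the parsed document for the example above might look like, assuming JSON parsing is enabled (the field names come from the example payload; any additional metadata fields depend on your Stack's configuration):

{
  "test": "This is a test",
  "Country": "United Kingdom",
  "Weather": "Sunny"
}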

Step 3 of this guide has an example of how to update a filter to parse JSON data.

Step 2 - Send JSON log files using curl

It is also possible to send an entire JSON file to Logstash for logging. This example shows how.

Let's have a look at the contents of our sample.json file.

{
  "name": "Jason",
  "city": "Manchester",
  "display": "Hello there from JSON file",
  "value": 5
}

Enter the following command at the command prompt to send the file to Logstash:

curl -i -H "ApiKey: your-api-key" -H "Content-Type: application/json" -H "LogType: json" https://api.logit.io/v2 -d @sample.json

As mentioned previously, the ApiKey value can be found as follows:

From the Stack dashboard page, click the Settings button, then choose Stack API Keys from the left-hand menu.

You should be able to see the new log record in Kibana. As mentioned earlier, if the receiving Stack has been set up to parse JSON, the data will be logged as four new fields; if not, the message field will contain the file contents as a single string.
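
If you have several JSON files to send, the same request can be repeated in a simple shell loop. The sketch below assumes the files end in .json and sit in the current directory:

for file in *.json; do
  curl -i -H "ApiKey: your-api-key" -H "Content-Type: application/json" -H "LogType: json" https://api.logit.io/v2 -d @"$file"
done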

The next section shows an example of how to update a filter to parse JSON data.

Step 3 - Logstash filter example for JSON

You can access your Stack's Logstash pipeline by doing the following:

Edit Pipelines

Below is an example of the code to add to a Logstash pipeline so that the string or file content sent through is identified as JSON and processed into distinct fields and values (if required):

if [type] == "json" {
  json {
    source => "message"
  }
}

or

if [message] =~ /^{.*}/ {
  json {
    source => "message"
  }
}

Without a filter like this, the data sent to Logstash will be treated as a plain string and logged accordingly in the message field.
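
For context, the sketch below shows where such a conditional sits inside a complete filter block (the input and output sections of your pipeline are omitted here):

filter {
  # Parse the message field as JSON when the event type is "json"
  if [type] == "json" {
    json {
      source => "message"
    }
  }
}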

This filter is already included in your default Logstash Pipelines!

Step 4 - Check Logit.io for your logs

Data should now have been sent to your Stack.

View my data

If you don't see logs, take a look at the How to diagnose no data in Stack section below for help with common issues.

Step 5 - How to diagnose no data in Stack

If you don't see data appearing in your Stack after following the steps, visit the Help Centre guide for steps to diagnose no data appearing in your Stack or Chat to support now.

Step 6 - JSON Logging Overview

JSON (also known as JavaScript Object Notation) is a lightweight syntax used for exchanging data. You may have previously encountered the format if you have ever used Serilog to write structured log data from Microsoft's .NET or .NET Core frameworks.

JSON has near-universal support, as the vast majority of programming languages support its Unicode encoding.

JSON has become the preferred format for standardising structured log messages & has largely taken over from XML, as it provides a more legible and compact format that can be queried like a database.

An exception can be found in the default logging formats of Apache & Nginx: their logs are even more compact than JSON, but are less suited to flexible parsing.

Our log file viewer supports JSON logs & provides centralised log management built for easy analysis, troubleshooting & testing within your production environment. Logit.io eliminates the need to spend hours searching & tailing across distributed servers for separate JSON logs.

If you need any more help with migrating your JSON logs to Logstash the Logit.io team are here to help. Feel free to visit our Help Centre or get in contact with our support team by sending us a message via live chat & we'll be happy to assist.
