
Send data via JSON to your Logstash instance provided by Logit.io

JSON

How to upload JSON logs or JSON log file to Logstash

Step 1 - Send a single JSON log using curl

The first example takes a string in JSON format and passes it through to Logstash via our API.

curl -i -H "ApiKey: your-api-key" -H "Content-Type: application/json" -H "LogType: default" https://api.logit.io/v2 -d '{"test":"This is a test", "Country":"United Kingdom", "Weather":"Sunny" }'

Your ApiKey can be found by doing the following:

From the stack dashboard page, click the Settings button, then choose Stack API Keys from the left-hand menu.

Executing the command above, with the ApiKey header set to the key of the stack where the data is to be logged, sends the string through to Logstash. The result can then be viewed by opening Kibana.

If the receiving stack has been set up to parse JSON, the data will have been logged as three new values. If the stack has not been set up to parse JSON, the message field will contain a string corresponding to the data that was sent.

Section 3 of this document has an example of how to update a filter to parse JSON data.
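If you prefer scripting the request, the same call can be sketched in Python. This is a minimal sketch using only the standard library, assuming the same placeholder ApiKey and the https://api.logit.io/v2 endpoint shown in the curl example; the network call is commented out so you can inspect the payload first.

```python
import json
import urllib.request

# Same payload and headers as the curl example; "your-api-key" is a placeholder.
payload = {"test": "This is a test", "Country": "United Kingdom", "Weather": "Sunny"}
headers = {
    "ApiKey": "your-api-key",
    "Content-Type": "application/json",
    "LogType": "default",
}

body = json.dumps(payload).encode("utf-8")
request = urllib.request.Request("https://api.logit.io/v2", data=body, headers=headers)

# Uncomment to actually send the log line to your stack:
# with urllib.request.urlopen(request) as response:
#     print(response.status)
```

Serialising with json.dumps before sending guarantees the body is valid JSON, which matters once the stack's filter starts parsing the message field.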

Step 2 - Send JSON log files using curl

It is also possible to send a JSON file to Logstash for logging. This example shows how.

Let's have a look at the contents of our sample.json file:

{
    "name": "Jason",
    "city": "Manchester",
    "display": "Hello there from JSON file",
    "value": 5
}

Enter the following command at the command prompt to send the file to Logstash:

curl -i -H "ApiKey: your-api-key" -H "Content-Type: application/json" -H "LogType: json" https://api.logit.io/v2 -d @sample.json

As mentioned previously, the ApiKey value can be found by clicking the Settings button on the stack dashboard page and choosing Stack API Keys from the left-hand menu.

You should now be able to see the new log record in Kibana. As before, if the receiving stack has been set up to parse JSON, the data will have been logged as four new values; if not, the message field will contain a string corresponding to the data that was sent.
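The file upload can be scripted in the same way. The following is a minimal Python sketch, assuming a sample.json next to the script and the same placeholder ApiKey as the curl example; it writes the sample file itself so it is self-contained, and the actual send is commented out.

```python
import json
import pathlib
import urllib.request

# Create the sample file from the example above (normally it would already exist).
sample = pathlib.Path("sample.json")
sample.write_text(json.dumps({
    "name": "Jason",
    "city": "Manchester",
    "display": "Hello there from JSON file",
    "value": 5,
}))

# Read and validate the file before sending; json.loads raises on malformed input.
body = sample.read_bytes()
record = json.loads(body)  # fail fast if the file is not valid JSON

request = urllib.request.Request(
    "https://api.logit.io/v2",
    data=body,
    headers={
        "ApiKey": "your-api-key",
        "Content-Type": "application/json",
        "LogType": "json",
    },
)
# Uncomment to send the file contents to your stack:
# urllib.request.urlopen(request)
```

Validating the file locally before sending catches malformed JSON early, rather than discovering it later as an unparsed message field in Kibana.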

The next section shows an example of how to update a filter to parse JSON data.

Step 3 - Logstash filter example for JSON

You can access your stack's Logstash filter by doing the following:

From the stack dashboard page, click the Settings button, then choose Logstash Filters from the left-hand menu.

Below is an example of the code that needs to be added to a Logstash filter so that the string or file content sent through is identified as JSON and processed as distinct fields and values (if required):

if [type] == "json" {
    json {
        source => "message"
    }
}

or

if [message] =~ /^{.*}/ {
    json {
        source => "message"
    }
}

Without a filter like this the data sent to Logstash will be treated as a string and logged accordingly in the message field.
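To illustrate the effect of the filter, here is a rough Python analogue (this is a sketch for explanation, not Logstash itself): when the message field looks like JSON, its keys are promoted to top-level fields on the event; otherwise the event is left untouched.

```python
import json
import re

def apply_json_filter(event):
    """Rough analogue of the Logstash json filter with source => "message"."""
    message = event.get("message", "")
    # Same shape check as the second filter example above: /^{.*}/
    if re.match(r"^{.*}", message):
        try:
            event.update(json.loads(message))
        except ValueError:
            pass  # leave the event as-is if the message is not valid JSON
    return event

parsed = apply_json_filter({"message": '{"Country": "United Kingdom", "Weather": "Sunny"}'})
unparsed = apply_json_filter({"message": "plain text, not JSON"})
```

After the filter runs, `parsed` carries Country and Weather as separate fields (which is what makes them individually searchable in Kibana), while `unparsed` keeps only its original message string.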

This filter is already included in your default Logstash Filters!

Step 4 - JSON Logging Overview

JSON (JavaScript Object Notation) is a lightweight syntax used for exchanging data. You may have previously encountered the JSON format if you have ever used Serilog to ship log data from Microsoft's .NET or .NET Core frameworks.

JSON has near-universal support, as the vast majority of programming languages can handle its Unicode encoding.

JSON has become the preferred format for standardising structured log messages, taking over from XML, as it offers a more legible and compact syntax that can be queried like a database.

An exception to this is the default logging format of Apache and Nginx: their logs are even more compact than JSON, but less suited to flexible parsing.

Our log file viewer supports JSON logs & provides centralised log management built for easy analysis, troubleshooting & testing within your production environment. Logit eliminates the need to spend hours searching & tailing across distributed servers for separate JSON logs.

If you need any more help with migrating your JSON logs to Logstash the Logit team are here to help. Feel free to visit our Help Centre or get in contact with our support team by sending us a message via live chat & we'll be happy to assist.
