KV filter plugin
Parses key/value pairs inside a text field into structured fields. Works well for URL query strings, structured syslog suffixes, and simple application logs.
- Package: logstash-filter-kv
- Coverage source: default/bundled
- Official catalog entry: Yes
Plugin overview
The kv plugin runs in the Logstash filter stage and parses key/value text into structured fields.
Typical use cases
- Parse key=value log messages into discrete fields.
- Handle dynamic key/value payloads from app and infrastructure logs.
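As an illustration of the query-string use case, a minimal configuration could look like the following sketch (field names are illustrative):

```
filter {
  kv {
    source      => "message"   # field containing the raw query string
    field_split => "&"         # pairs are separated by "&"
    value_split => "="         # keys and values are separated by "="
  }
}
```

With this configuration, an input such as `user=alice&action=login` would yield the fields `user` and `action` on the event.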
Input and output behavior
- Flow: Parses key=value strings and emits dynamic fields from those pairs.
- Input field: source (default: "message").
- Output target: controlled by target.
- Important options: source, target, tag_on_failure, allow_duplicate_values.
- Failure signaling: uses tag_on_failure (default: ["_kv_filter_error"]) so failed events can be routed or inspected.
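For example, parsed pairs can be nested under a parent field via target so they do not collide with top-level event fields (field names here are illustrative):

```
filter {
  kv {
    source => "message"   # read key/value text from "message"
    target => "kv"        # nest results under the "kv" field
  }
}
```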
Options
Required
- No required plugin-specific options.
Optional
- allow_duplicate_values (type: boolean; default: true) — Keep duplicate values instead of deduplicating them into an array.
- allow_empty_values (type: boolean; default: false) — Keep keys whose value is an empty string.
- default_keys (type: hash; default: {}) — Keys to add with default values when missing from the input.
- ecs_compatibility (type: string) — Controls ECS field compatibility behaviour (disabled, v1, or v8).
- exclude_keys (type: array; default: []) — Drop keys listed here.
- field_split (type: string; default: " ") — String that separates pairs from each other (for example " " or "&").
- field_split_pattern (type: string; default: none) — Regex alternative to field_split for more complex separators.
- include_brackets (type: boolean; default: true) — Treat bracket characters as part of values (useful for [ and ]).
- include_keys (type: array; default: []) — Only keep keys listed here.
- prefix (type: string; default: "") — Prefix added to every extracted key name.
- recursive (type: boolean; default: false) — Recursively parse values that themselves look like key/value strings.
- remove_char_key (type: string; default: none) — Characters to remove entirely from every key.
- remove_char_value (type: string; default: none) — Characters to remove entirely from every value.
- source (type: string; default: "message") — Field containing the key/value text.
- target (type: string; default: none) — Parent field to nest parsed pairs under.
- tag_on_failure (type: array; default: ["_kv_filter_error"]) — Tags applied when parsing produces no fields.
- tag_on_timeout (type: string; default: "_kv_filter_timeout") — Tag applied when parsing exceeds timeout_millis.
- timeout_millis (type: number; default: 30000, i.e. 30 seconds) — Maximum time in milliseconds to spend parsing a single event.
- transform_key (type: string; default: none) — Transform every key (lowercase, uppercase, or capitalize).
- transform_value (type: string; default: none) — Transform every value (lowercase, uppercase, or capitalize).
- trim_key (type: string; default: none) — Characters to strip from the beginning and end of each key.
- trim_value (type: string; default: none) — Characters to strip from the beginning and end of each value.
- value_split (type: string; default: "=") — String that separates a key from its value (for example "=" or ":").
- value_split_pattern (type: string; default: none) — Regex alternative to value_split.
- whitespace (type: string; default: lenient) — Whitespace handling mode (lenient or strict).
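As a sketch of how some of these options combine, the following configuration normalizes keys and supplies a fallback value (the field and default shown are illustrative, not plugin defaults):

```
filter {
  kv {
    source        => "message"
    trim_key      => " "                     # strip surrounding spaces from keys
    transform_key => "lowercase"             # normalize key case
    default_keys  => { "env" => "unknown" }  # added only when the key is absent
  }
}
```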
Example configuration
filter {
kv {
source => "message"
target => "params"
field_split => "&?"
value_split => "="
include_keys => [ "user_id", "session", "action" ]
transform_key => "lowercase"
trim_value => "\"'"
tag_on_failure => [ "_kvparsefailure" ]
}
}

Common options configuration
All Logstash filter plugins support these shared options:
- add_field (type: hash; default: {}) — Adds fields when the filter succeeds. Supports dynamic field names and values.
- add_tag (type: array; default: []) — Adds one or more tags when the filter succeeds.
- enable_metric (type: boolean; default: true) — Enables or disables metric collection for this plugin instance.
- id (type: string; default: none) — Sets an explicit plugin instance ID for monitoring and troubleshooting.
- periodic_flush (type: boolean; default: false) — Calls the filter flush method at regular intervals.
- remove_field (type: array; default: []) — Removes fields when the filter succeeds. Supports dynamic field names.
- remove_tag (type: array; default: []) — Removes tags when the filter succeeds.
filter {
kv {
add_field => { "pipeline_stage" => "parsed" }
add_tag => ["parsed", "logstash_filter"]
enable_metric => true
id => "my_filter_instance"
periodic_flush => false
remove_field => ["tmp_field"]
remove_tag => ["temporary"]
}
}

Apply in Logit.io
- Open your stack in Logit.io and navigate to Logstash Pipelines.
- In the filter { ... } section, add a kv block.
- Save your pipeline changes, then restart the Logstash pipeline if prompted.
- Send sample events and verify parsed/enriched fields in OpenSearch Dashboards.
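When adding the block, it is common to guard it with a condition so only matching events are parsed. A sketch, assuming a hypothetical log_type field:

```
filter {
  if [log_type] == "query_string" {   # hypothetical routing field
    kv {
      source => "message"
      target => "params"
    }
  }
}
```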
Validation checklist
- Confirm the kv block compiles without syntax errors.
- Verify expected new/updated fields exist in sample documents.
- Verify unexpected fields are not removed unless explicitly configured.
- Confirm tags added on success/failure align with your alerting and routing rules.
Troubleshooting
- If events are unchanged, verify your filter condition (if ...) matches incoming events.
- If the pipeline fails to start, validate braces/quotes and retry with a minimal filter block.
- Check for tag_on_failure tags (default: ["_kv_filter_error"]) to quickly isolate parse failures.
- If throughput drops, reduce expensive operations and test with representative sample volume.
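To isolate failures quickly during debugging, events carrying the failure tag can be routed to a separate output. A sketch (the stdout output is for inspection only, and the lowered timeout value is illustrative):

```
filter {
  kv {
    source         => "message"
    tag_on_failure => [ "_kv_filter_error" ]
    timeout_millis => 10000   # fail faster on pathological inputs
  }
}
output {
  if "_kv_filter_error" in [tags] {
    stdout { codec => rubydebug }   # print failed events for inspection
  }
}
```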
References
- GitHub package: logstash-filter-kv
- Canonical catalog: /log-management/ingestion-pipeline/logstash-filters-reference