Mutate filter plugin
General-purpose field editor for Logstash. Use it to rename, convert, replace, remove, copy, split, merge, or case-normalise fields. It is typically the last step before output, where you shape events to match the target schema.
- Package: `logstash-filter-mutate`
- Coverage source: default/bundled
- Official catalog entry: Yes
Plugin overview
`mutate` runs in the Logstash filter stage. It renames, converts, removes, and updates fields on the event in place.
Typical use cases
- Rename, convert, and normalize fields before indexing.
- Clean noisy event payloads by removing temporary or unused fields.
Input and output behavior
- Flow: Applies in-place field mutations such as rename, convert, replace, and remove.
- Input: works on events that match your surrounding `if` conditions.
- Output: updates the current event in place unless configured otherwise.
- Important options: `tag_on_failure`, `convert`, `copy`, `gsub`.
- Failure signaling: uses `tag_on_failure` (default: `_mutate_error`) so failed events can be routed or inspected.
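As a sketch of that routing pattern, events tagged `_mutate_error` can be diverted to a separate output for inspection (the `stdout` output here is illustrative; any output block works):

```
filter {
  mutate {
    convert => { "[count]" => "integer" }   # tags the event if the value cannot be converted
  }
}

output {
  if "_mutate_error" in [tags] {
    stdout { codec => rubydebug }           # inspect failed mutations separately
  }
}
```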
Options
Required
- No required plugin-specific options.
Optional
- `convert` (type: hash; default: none) — Convert field types, for example `{ "count" => "integer", "ratio" => "float" }`.
- `copy` (type: hash; default: none) — Copy fields; `{ "src" => "dst" }` copies `src` to `dst` without removing `src`.
- `gsub` (type: array; default: none) — Regex-based substitution on field values (triples of field, pattern, replacement).
- `join` (type: hash; default: none) — Join an array field into a single string using a separator.
- `lowercase` (type: array; default: none) — Lowercase the listed field values.
- `merge` (type: hash; default: none) — Merge two fields together (array into array, hash into hash).
- `coerce` (type: hash; default: none) — Set a default value for a field that exists but whose value is null.
- `rename` (type: hash; default: none) — Rename fields; `{ "source" => "destination" }`.
- `replace` (type: hash; default: none) — Replace a field's value; supports `%{...}` interpolation.
- `split` (type: hash; default: none) — Split a string field into an array using a delimiter.
- `strip` (type: array; default: none) — Strip leading and trailing whitespace from listed fields.
- `update` (type: hash; default: none) — Update a field only when it already exists.
- `uppercase` (type: array; default: none) — Uppercase the listed field values.
- `capitalize` (type: array; default: none) — Capitalize the listed field values.
- `tag_on_failure` (type: array; default: `["_mutate_error"]`) — Tags applied to the event when a mutate operation fails.
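The distinction between `replace`, `update`, and `coerce` is a common source of confusion. A sketch (the field name is illustrative); note that mutate applies its operations in a fixed documented order (for example, `coerce` runs before `update` and `replace`), not in the order they are written in the block:

```
filter {
  mutate {
    replace => { "[event][kind]" => "event" }    # sets the field whether or not it already exists
    update  => { "[event][kind]" => "metric" }   # changes the field only if it already exists
    coerce  => { "[event][kind]" => "unknown" }  # fills in a default when the field exists but is null
  }
}
```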
Example configuration
```
filter {
  mutate {
    rename => {
      "[host]" => "[host][name]"
      "[src]" => "[source][ip]"
    }
    convert => {
      "[http][response][status_code]" => "integer"
      "[http][response][time_ms]" => "float"
    }
    gsub => [
      "[url][path]", "/\\d+", "/:id"
    ]
    lowercase => [ "[log][level]" ]
    remove_field => [ "raw_message", "tmp" ]
    tag_on_failure => [ "_mutateparsefailure" ]
  }
}
```

Common options configuration
All Logstash filter plugins support these shared options:
- `add_field` (type: hash; default: `{}`) — Adds fields when the filter succeeds. Supports dynamic field names and values.
- `add_tag` (type: array; default: `[]`) — Adds one or more tags when the filter succeeds.
- `enable_metric` (type: boolean; default: `true`) — Enables or disables metric collection for this plugin instance.
- `id` (type: string; default: none) — Sets an explicit plugin instance ID for monitoring and troubleshooting.
- `periodic_flush` (type: boolean; default: `false`) — Calls the filter flush method at regular intervals.
- `remove_field` (type: array; default: `[]`) — Removes fields when the filter succeeds. Supports dynamic field names.
- `remove_tag` (type: array; default: `[]`) — Removes tags when the filter succeeds.
```
filter {
  mutate {
    add_field => { "pipeline_stage" => "parsed" }
    add_tag => ["parsed", "logstash_filter"]
    enable_metric => true
    id => "my_filter_instance"
    periodic_flush => false
    remove_field => ["tmp_field"]
    remove_tag => ["temporary"]
  }
}
```

Apply in Logit.io
- Open your stack in Logit.io and navigate to Logstash Pipelines.
- In the `filter { ... }` section, add a `mutate` block.
- Save your pipeline changes, then restart the Logstash pipeline if prompted.
- Send sample events and verify parsed/enriched fields in OpenSearch Dashboards.
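To produce sample events without waiting for live traffic, a throwaway pipeline like this (field names are illustrative) exercises the mutate block end to end using the `generator` input:

```
input {
  generator {
    count   => 1
    message => '{"level":"INFO","count":"42"}'
    codec   => json
  }
}

filter {
  mutate {
    lowercase => [ "level" ]                 # "INFO" becomes "info"
    convert   => { "count" => "integer" }    # "42" (string) becomes 42 (integer)
  }
}

output {
  stdout { codec => rubydebug }              # print the mutated event for inspection
}
```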
Validation checklist
- Confirm the `mutate` block compiles without syntax errors.
- Verify expected new/updated fields exist in sample documents.
- Verify unexpected fields are not removed unless explicitly configured.
- Confirm tags added on success/failure align with your alerting and routing rules.
Troubleshooting
- If events are unchanged, verify your filter condition (`if ...`) matches incoming events.
- If the pipeline fails to start, validate braces/quotes and retry with a minimal filter block.
- Check for `tag_on_failure` tags (default: `_mutate_error`) to quickly isolate parse/mutation failures.
- If throughput drops, reduce expensive operations and test with representative sample volume.
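When isolating a startup failure, a minimal block like this (the tag name is illustrative) confirms the pipeline itself is healthy before options are reintroduced one at a time:

```
filter {
  mutate {
    add_tag => [ "mutate_smoke_test" ]   # smallest valid mutate block; if this compiles, add options back one by one
  }
}
```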
References
- GitHub package: `logstash-filter-mutate`
- Canonical catalog: /log-management/ingestion-pipeline/logstash-filters-reference