Translate filter plugin
Looks up values in a dictionary and writes the mapped result to a destination field. Supports inline dictionaries, external YAML/JSON/CSV files, and optional regex-style matching.
- Package: logstash-filter-translate
- Coverage source: default/bundled
- Official catalog entry: Yes
Plugin overview
translate runs in the Logstash filter stage and maps field values using dictionary data.
Typical use cases
- Transform fields before indexing to keep schema and naming consistent.
- Prepare high-quality fields for alerts, dashboards, and downstream pipelines.
Input and output behavior
- Flow: Maps input values through a dictionary into normalized output values.
- Input field: source
- Output target: controlled by target
- Important options: source, field, target, dictionary_path
Options
Required
source (type: string; default: none) — Field whose value is used as the lookup key.
Optional
- destination (type: string) — Deprecated alias for target; prefer target.
- dictionary (type: hash; default: {}) — Inline key/value dictionary used for the lookup.
- dictionary_path (type: a valid filesystem path; default: none) — Path to an external YAML, JSON, or CSV file containing the dictionary.
- ecs_compatibility (type: string) — Controls ECS field compatibility behaviour (disabled, v1, or v8).
- exact (type: boolean; default: true) — When true, use exact matching; set to false for substring replacement.
- fallback (type: string; default: none) — Value to use when the lookup misses.
- field (type: string) — Deprecated alias for source; prefer source.
- iterate_on (type: string; default: none) — Iterate the lookup over an array-valued source field.
- override (type: boolean) — Overwrite the target field if it already has a value.
- refresh_interval (type: number; default: 300) — Seconds between reloads of dictionary_path.
- regex (type: boolean; default: false) — When true, treat dictionary keys as regex patterns.
- refresh_behaviour (type: string; default: merge) — Reload strategy when a dictionary file is updated (merge or replace).
- target (type: string) — Field that receives the mapped value.
- yaml_dictionary_code_point_limit (type: number; default: 134217728, i.e. 128 MB for 1-byte code points) — YAML parse safety limit for dictionary files.
- yaml_load_strategy (type: string; default: one_shot) — YAML loader strategy; safe is recommended.
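For larger mappings, the dictionary can live in an external file that Logstash reloads on a schedule via dictionary_path, refresh_interval, and refresh_behaviour. A minimal sketch; the file path and field names below are illustrative, not part of the plugin:

filter {
  translate {
    source => "[source][ip]"
    target => "[source][label]"
    # Illustrative path; point this at your own YAML, JSON, or CSV dictionary
    dictionary_path => "/etc/logstash/dictionaries/ip_labels.csv"
    refresh_interval => 300        # re-check the file every 5 minutes
    refresh_behaviour => "merge"   # fold file updates into the in-memory dictionary
    fallback => "unknown"
  }
}

Setting regex => true additionally treats each dictionary key as a regular expression, which is useful when many raw values should collapse to one label, at some matching cost per event.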
Example configuration
filter {
  translate {
    source => "[http][response][status_code]"
    target => "[http][response][status_family]"
    dictionary => {
      "200" => "success"
      "201" => "success"
      "301" => "redirect"
      "400" => "client_error"
      "401" => "unauthorized"
      "404" => "not_found"
      "500" => "server_error"
    }
    fallback => "other"
    exact => true
    override => true
  }
}
Common options configuration
All Logstash filter plugins support these shared options:
- add_field (type: hash; default: {}) — Adds fields when the filter succeeds. Supports dynamic field names and values.
- add_tag (type: array; default: []) — Adds one or more tags when the filter succeeds.
- enable_metric (type: boolean; default: true) — Enables or disables metric collection for this plugin instance.
- id (type: string; default: none) — Sets an explicit plugin instance ID for monitoring and troubleshooting.
- periodic_flush (type: boolean; default: false) — Calls the filter flush method at regular intervals.
- remove_field (type: array; default: []) — Removes fields when the filter succeeds. Supports dynamic field names.
- remove_tag (type: array; default: []) — Removes tags when the filter succeeds.
filter {
  translate {
    add_field => { "pipeline_stage" => "parsed" }
    add_tag => ["parsed", "logstash_filter"]
    enable_metric => true
    id => "my_filter_instance"
    periodic_flush => false
    remove_field => ["tmp_field"]
    remove_tag => ["temporary"]
  }
}
Apply in Logit.io
- Open your stack in Logit.io and navigate to Logstash Pipelines.
- In the filter { ... } section, add a translate block.
- Save your pipeline changes, then restart the Logstash pipeline if prompted.
- Send sample events and verify parsed/enriched fields in OpenSearch Dashboards.
Validation checklist
- Confirm the translate block compiles without syntax errors.
- Verify expected new/updated fields exist in sample documents.
- Verify unexpected fields are not removed unless explicitly configured.
- Confirm tags added on success/failure align with your alerting and routing rules.
Troubleshooting
- If events are unchanged, verify your filter condition (if ...) matches incoming events.
- If the pipeline fails to start, validate braces/quotes and retry with a minimal filter block.
- If throughput drops, reduce expensive operations and test with representative sample volume.
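If lookups should run only for a subset of events, wrapping the translate block in a conditional avoids both missed matches and wasted work. A sketch reusing the status-code example above; the surrounding if test assumes the field exists in your events:

filter {
  if [http][response][status_code] {
    translate {
      source => "[http][response][status_code]"
      target => "[http][response][status_family]"
      dictionary => { "200" => "success" }  # trimmed for brevity
      fallback => "other"
    }
  }
}

Events without the field bypass the filter entirely, which also makes "events unchanged" problems easier to diagnose: check whether the conditional itself ever matches before inspecting the dictionary.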
References
- GitHub package: logstash-filter-translate
- Canonical catalog: /log-management/ingestion-pipeline/logstash-filters-reference