Translate filter plugin

Looks up values in a dictionary and writes the mapped result to a destination field. Supports inline dictionaries, external YAML/JSON/CSV files, and optional regex-style matching.

  • Package: logstash-filter-translate
  • Coverage source: default/bundled
  • Official catalog entry: Yes

Plugin overview

The translate filter runs in the Logstash filter stage and maps field values using dictionary data.

Typical use cases

  • Transform fields before indexing to keep schema and naming consistent.
  • Prepare high-quality fields for alerts, dashboards, and downstream pipelines.

Input and output behavior

  • Flow: Maps input values through a dictionary into normalized output values.
  • Input field: controlled by source.
  • Output field: controlled by target.
  • Important options: source, target, dictionary, dictionary_path.

Options

Required

  • source (type: string; default: none) — Field whose value is used as the lookup key.

Optional

  • destination (type: string) — Deprecated alias for target; prefer target.
  • dictionary (type: hash; default: {}) — Inline key/value dictionary used for the lookup.
  • dictionary_path (type: a valid filesystem path; default: none) — Path to an external YAML, JSON, or CSV file containing the dictionary.
  • ecs_compatibility (type: string) — Controls ECS field compatibility behaviour (disabled, v1, or v8).
  • exact (type: boolean; default: true) — When true (the default), the lookup key must match a dictionary key exactly; set false to allow substring replacement.
  • fallback (type: string; default: none) — Value to use when the lookup misses.
  • field (type: string) — Deprecated alias for source; prefer source.
  • iterate_on (type: string; default: none) — Iterate the lookup over an array-valued source field.
  • override (type: boolean) — Overwrite the target field if it already has a value.
  • refresh_behaviour (type: string; default: merge) — Reload strategy when a dictionary file is updated (merge or replace).
  • refresh_interval (type: number; default: 300) — Seconds between reloads of dictionary_path.
  • regex (type: boolean; default: false) — When true, treat dictionary keys as regular-expression patterns.
  • target (type: string) — Field that receives the mapped value.
  • yaml_dictionary_code_point_limit (type: number; default: 134217728, i.e. 128 MB of single-byte code points) — YAML parse safety limit for dictionary files.
  • yaml_load_strategy (type: string; default: one_shot) — YAML loader strategy; safe is recommended.
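An external dictionary can be combined with the refresh options above. A minimal sketch, assuming a YAML file at an illustrative path (the path, field names, and file contents are examples, not from the source):

```
filter {
  translate {
    source            => "[source][ip]"
    target            => "[threat][category]"
    dictionary_path   => "/etc/logstash/dictionaries/threat_ips.yml"
    refresh_interval  => 300      # re-read the file every 5 minutes
    refresh_behaviour => "merge"  # keep existing entries, add/overwrite from file
    fallback          => "unknown"
  }
}
```

The referenced YAML file would contain simple key/value pairs, for example:

```
# /etc/logstash/dictionaries/threat_ips.yml (illustrative)
"203.0.113.10": "botnet"
"198.51.100.7": "scanner"
```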

Example configuration

filter {
  translate {
    source      => "[http][response][status_code]"
    target      => "[http][response][status_family]"
    dictionary  => {
      "200" => "success"
      "201" => "success"
      "301" => "redirect"
      "400" => "client_error"
      "401" => "unauthorized"
      "404" => "not_found"
      "500" => "server_error"
    }
    fallback    => "other"
    exact       => true
    override    => true
  }
}
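The block above performs strict key matching because exact defaults to true. When dictionary keys should be treated as patterns instead, regex can be enabled; a hedged sketch (the field names and patterns are illustrative):

```
filter {
  translate {
    source => "[user_agent][original]"
    target => "[user_agent][family]"
    regex  => true   # keys below are regular expressions
    exact  => false  # allow the pattern to match anywhere in the value
    dictionary => {
      "Chrome"  => "chrome"
      "Firefox" => "firefox"
    }
    fallback => "other"
  }
}
```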

Common options configuration

All Logstash filter plugins support these shared options:

  • add_field (type: hash; default: {}) — Adds fields when the filter succeeds. Supports dynamic field names and values.
  • add_tag (type: array; default: []) — Adds one or more tags when the filter succeeds.
  • enable_metric (type: boolean; default: true) — Enables or disables metric collection for this plugin instance.
  • id (type: string; default: none) — Sets an explicit plugin instance ID for monitoring and troubleshooting.
  • periodic_flush (type: boolean; default: false) — Calls the filter flush method at regular intervals.
  • remove_field (type: array; default: []) — Removes fields when the filter succeeds. Supports dynamic field names.
  • remove_tag (type: array; default: []) — Removes tags when the filter succeeds.
Example with common options:

filter {
  translate {
    add_field => { "pipeline_stage" => "parsed" }
    add_tag => ["parsed", "logstash_filter"]
    enable_metric => true
    id => "my_filter_instance"
    periodic_flush => false
    remove_field => ["tmp_field"]
    remove_tag => ["temporary"]
  }
}

Apply in Logit.io

  1. Open your stack in Logit.io and navigate to Logstash Pipelines.
  2. In the filter { ... } section, add a translate block.
  3. Save your pipeline changes, then restart the Logstash pipeline if prompted.
  4. Send sample events and verify parsed/enriched fields in OpenSearch Dashboards.
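Before applying changes in a live pipeline, the translate block can be exercised locally with a generator input and rubydebug output (a sketch for local testing; the sample status code is illustrative):

```
input {
  generator {
    count     => 1
    message   => "test"
    add_field => { "[http][response][status_code]" => "404" }
  }
}
filter {
  translate {
    source     => "[http][response][status_code]"
    target     => "[http][response][status_family]"
    dictionary => { "404" => "not_found" }
    fallback   => "other"
  }
}
output { stdout { codec => rubydebug } }
```

Running this with bin/logstash -f <file> should emit a single event whose [http][response][status_family] field is set by the lookup.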

Validation checklist

  • Confirm the translate block compiles without syntax errors.
  • Verify expected new/updated fields exist in sample documents.
  • Verify unexpected fields are not removed unless explicitly configured.
  • Confirm tags added on success/failure align with your alerting and routing rules.

Troubleshooting

  • If events are unchanged, verify your filter condition (if ...) matches incoming events.
  • If the pipeline fails to start, validate braces/quotes and retry with a minimal filter block.
  • If throughput drops, reduce expensive operations and test with representative sample volume.
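When events pass through unchanged, a common cause is a surrounding conditional that never matches. Wrapping the filter in an explicit field check and tagging both branches helps isolate this (a sketch; the field name and dictionary path are illustrative):

```
filter {
  if [http][response][status_code] {
    translate {
      source          => "[http][response][status_code]"
      target          => "[http][response][status_family]"
      dictionary_path => "/etc/logstash/status_map.yml"
      add_tag         => ["translated"]
    }
  } else {
    mutate { add_tag => ["translate_skipped"] }
  }
}
```

Checking which tag appears on sample documents shows whether the condition or the lookup itself is the problem.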
