Auditd
Collect and ship Auditd logs to Logstash and Elasticsearch.
Filebeat is a lightweight shipper that enables you to send your Auditd application logs to Logstash and Elasticsearch. Configure Filebeat using the pre-defined examples below to start sending and analysing your Auditd application logs.
Step 1 - Install Filebeat
deb (Debian/Ubuntu/Mint)
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-amd64.deb
sudo dpkg -i filebeat-oss-7.8.1-amd64.deb
rpm (CentOS/RHEL/Fedora)
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-x86_64.rpm
sudo rpm -vi filebeat-oss-7.8.1-x86_64.rpm
macOS
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-oss-7.8.1-darwin-x86_64.tar.gz
tar xzvf filebeat-oss-7.8.1-darwin-x86_64.tar.gz
Windows
- Download the Filebeat Windows zip file from the official downloads page.
- Extract the contents of the zip file into C:\Program Files.
- Rename the filebeat-&lt;version&gt;-windows directory to filebeat.
- Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator). If you are running Windows XP, you may need to download and install PowerShell.
- Run the following commands to install Filebeat as a Windows service:
cd 'C:\Program Files\filebeat'
.\install-service-filebeat.ps1
If script execution is disabled on your system, set the execution policy for the current session to allow the script to run:
PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1
Step 2 - Enable the Auditd Module
There are several built-in Filebeat modules you can use. To enable the Auditd module, run:
deb/rpm
sudo filebeat modules list
sudo filebeat modules enable auditd
Additional module configuration can be done using the per-module config files located in the modules.d folder. Most commonly this is used to read logs from a non-default location.
deb/rpm /etc/filebeat/modules.d/
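For example, to read audit logs from a non-default location you might edit modules.d/auditd.yml along these lines (the path shown is illustrative; adjust it for your system):

```yaml
- module: auditd
  log:
    enabled: true
    # Override the module's default log location with a custom path.
    # The path below is an example only.
    var.paths: ["/var/log/audit/audit.log*"]
```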
Step 3 - Configure Output
We'll be shipping to Logstash so that we have the option to run filters before the data is indexed.
Comment out the elasticsearch output block.
## Comment out elasticsearch output
#output.elasticsearch:
# hosts: ["localhost:9200"]
Uncomment and change the Logstash output to match the example below.
output.logstash:
hosts: ["your-logstash-host:your-ssl-port"]
loadbalance: true
ssl.enabled: true
Step 4 - Validate Configuration
Let's check that the configuration file is syntactically correct by running Filebeat directly in the terminal. If the file is invalid, Filebeat will print an `error loading config file` message with details on how to correct the problem. You can also run `filebeat test config` to validate the configuration without starting Filebeat.
deb/rpm
sudo filebeat -e -c /etc/filebeat/filebeat.yml
macOS
cd <EXTRACTED_ARCHIVE>
./filebeat -e -c filebeat.yml
Windows
cd <EXTRACTED_ARCHIVE>
.\filebeat.exe -e -c filebeat.yml
Step 5 - (Optional) Update Logstash Filters
All Logit stacks come pre-configured with popular Logstash filters. We recommend adding Auditd-specific filters if you don't already have them, to ensure your enhanced dashboards and modules work correctly.
Edit your Logstash filters by choosing Stack > Settings > Logstash Filters.
if [fileset][module] == "auditd" {
  grok {
    match => {
      "message" => [
        "%{AUDIT_PREFIX} %{AUDIT_KEY_VALUES:[auditd][log][kv]} old auid=%{NUMBER:[auditd][log][old_auid]} new auid=%{NUMBER:[auditd][log][new_auid]} old ses=%{NUMBER:[auditd][log][old_ses]} new ses=%{NUMBER:[auditd][log][new_ses]}",
        "%{AUDIT_PREFIX} %{AUDIT_KEY_VALUES:[auditd][log][kv]} msg=['\"](%{DATA:[auditd][log][msg]}\s+)?%{AUDIT_KEY_VALUES:[auditd][log][sub_kv]}['\"]",
        "%{AUDIT_PREFIX} %{AUDIT_KEY_VALUES:[auditd][log][kv]}",
        "%{AUDIT_PREFIX}",
        "%{AUDIT_TYPE} %{AUDIT_KEY_VALUES:[auditd][log][kv]}"
      ]
    }
    pattern_definitions => {
      "AUDIT_TYPE" => "^type=%{NOTSPACE:[auditd][log][record_type]}"
      "AUDIT_PREFIX" => "%{AUDIT_TYPE} msg=audit\(%{NUMBER:[auditd][log][epoch]}:%{NUMBER:[auditd][log][sequence]}\):(%{DATA})?"
      "AUDIT_KEY_VALUES" => "%{WORD}=%{GREEDYDATA}"
    }
}
date {
match => [ "[auditd][log][epoch]", "UNIX" ]
target => "@timestamp"
}
mutate {
convert => { "[auditd][log][sequence]" => "integer" }
}
geoip {
source => "[auditd][log][addr]"
target => "[auditd][log][geoip]"
}
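For reference, a raw auditd message matched by the patterns above looks like the following (illustrative values only). The `AUDIT_PREFIX` pattern captures the record type, epoch timestamp, and sequence number from the `type=... msg=audit(epoch:sequence):` prefix, and the remaining key-value pairs land in `[auditd][log][kv]`:

```
type=SYSCALL msg=audit(1571838262.436:319): arch=c000003e syscall=59 success=yes exit=0 pid=1894 auid=1000 uid=0 comm="cat" exe="/bin/cat" key="commands"
```

The date filter then parses the captured epoch (UNIX format, including fractional seconds) into `@timestamp`, and the mutate filter converts the sequence number to an integer for sorting.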
}