Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.
Now that you’ve got all your logs flying through Logstash into Elasticsearch, how do you remove old records that are no longer doing anything but consuming disk space and RAM for the index?
Both operations are built into Elasticsearch: deleting an index is pretty easy, and so is closing one.
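For reference, both can be done by hand against the REST API. A minimal sketch, assuming Elasticsearch is listening on 127.0.0.1:9200 and using an example index name:

```shell
# Close an index: it stays on disk but stops consuming RAM for its segments
curl -XPOST 'http://127.0.0.1:9200/logstash-2013.01.01/_close'

# Delete an index: removes it entirely, freeing disk space
curl -XDELETE 'http://127.0.0.1:9200/logstash-2013.01.01'
```

Doing this daily by hand gets old fast, which is where curator comes in.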
I like the idea of being able to let a cron job kick off the cleanup so I don’t forget.
To install it, we’ll first have to install pip.
sudo apt-get install python-pip
Then use pip to install elasticsearch-curator
pip install elasticsearch-curator
When making a cron job, I always use full paths
which curator
/usr/local/bin/curator
Edit the crontab. Curator doesn’t need any special privileges, so I’ll run this under my own user.
Add the following line to run curator at 20 minutes past midnight (system time). It connects to the Elasticsearch node on 127.0.0.1, deletes all indexes older than 120 days, and closes all indexes older than 90 days.
20 0 * * * /usr/local/bin/curator --host 127.0.0.1 -d 120 -c 90
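If you want to keep a record of what curator did each night, the same crontab entry can redirect its output to a log file. The log path here is just an example:

```shell
# Crontab fragment: same schedule, but append stdout and stderr to a log
20 0 * * * /usr/local/bin/curator --host 127.0.0.1 -d 120 -c 90 >> /tmp/curator.log 2>&1
```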
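Under the hood, the age check is simple: daily Logstash indices carry their date in the name, so "older than 120 days" is just a name comparison against a cutoff date. A rough illustrative sketch (not curator's actual code), assuming `logstash-YYYY.MM.DD` index names and GNU `date`:

```shell
# Compute the cutoff date 120 days back (GNU date syntax)
cutoff=$(date -d "120 days ago" +%Y.%m.%d)

index="logstash-2013.01.15"      # example index name
index_date=${index#logstash-}    # strip the prefix, leaving 2013.01.15

# YYYY.MM.DD sorts lexicographically, so a plain string comparison works
if [[ "$index_date" < "$cutoff" ]]; then
  echo "older than 120 days: would delete $index"
fi
```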
If you prefer an alternative, here’s one written in Perl.