How to pipe all log file entries to a database on Linux?

Xeoncross asked:

I am thinking about building an application that would read my different server log files, save the parsed data to a database, and then remove the parsed lines from the log file.

This is my programmer side trying to think of a better way to get all the log files into a usable, searchable console where I can view multiple servers at once, and to script checks that find correlations or email/text me when something looks wrong while I’m away.

There seem to be expensive solutions for this already on the market, but I can’t justify the hefty $500-$2000/mo prices.

Anyway, my problem is this: how do I remove lines from an actively written log file without causing problems? Some of the log files are rotated (like nginx’s), while others are not. I suppose the safest thing would be to copy the file and then run echo '' > file.log to erase it, but I might miss 300 ms of writes that way.
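
For what it’s worth, here is a rough sketch of that copy-then-truncate idea (the paths are made up). Emptying the file with truncate -s 0 keeps the writer’s open file handle valid, unlike deleting the file, but the small window of missed writes remains; logrotate’s copytruncate option makes the same trade-off.

    # Snapshot the live log for parsing, then empty it in place (assumed paths).
    # Truncating keeps the writer's open file handle valid, unlike deleting the file,
    # but anything written between the two commands is still lost.
    cp /var/log/myapp.log /tmp/myapp.log.snapshot
    truncate -s 0 /var/log/myapp.log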

The other question is whether I even want to erase the logs. I suppose not erasing them means I would have to open the file, jump to the end, and work my way backwards until I reached the last known entry.
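
That idea can be sketched in a few lines of shell; the log path and offset file below are assumptions, and a real collector would need to handle rotation more robustly.

    #!/bin/sh
    # Read only the bytes appended since the last run by remembering an offset,
    # so the live log never needs to be truncated. Paths are assumptions.
    LOG=/var/log/nginx/access.log
    STATE=/var/tmp/access.offset

    last=$(cat "$STATE" 2>/dev/null || echo 0)
    size=$(stat -c %s "$LOG")

    # If the file shrank, it was rotated or truncated; start over from the top.
    [ "$size" -lt "$last" ] && last=0

    # Print everything after the saved offset, then remember the new end of file.
    tail -c +"$((last + 1))" "$LOG"
    echo "$size" > "$STATE"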

What would be a good way to pipe log data to an external application/database?

My answer:


Don’t reinvent the wheel.

Use Logstash to get your logs off your systems.

Have Logstash send the logs to Elasticsearch.

Use the Kibana front end for analytics.

This combination is so common it’s known as the ELK stack. And it’s all open source.
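
As a very rough sketch (assuming a Debian-style Logstash install, the nginx access log as the source, and Elasticsearch on the same host), a minimal pipeline config looks something like this; check the Logstash documentation for the options your version supports.

    # Minimal pipeline: tail the nginx access log and ship each line to Elasticsearch.
    # The log path, index name, and host below are assumptions; adjust for your setup.
    sudo tee /etc/logstash/conf.d/nginx.conf >/dev/null <<'EOF'
    input {
      file {
        path => "/var/log/nginx/access.log"
        start_position => "beginning"
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "nginx-%{+YYYY.MM.dd}"
      }
    }
    EOF
    sudo systemctl restart logstash    # assumes Logstash runs as a systemd service

Kibana then points at those same indices for searching and dashboards across every server shipping logs this way.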


View the full question and answer on Server Fault.

This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.