I have a log file that is filled with exceptions that are not useful to me.
An exception is logged every two seconds, and when looking at a log file that contains 24 hours of logging it becomes overwhelming to get to the relevant information I need.
My logs look something like this:
2013-04-21 00:00:00,852 [service name] ERROR java-class - Exception at java.net ...... at java.apache .... and 28 more lines like these.
I want to write a cleaned-up copy of the log to another file.
Obviously grep -v "string" -A29 foo.log > new_file.log doesn't help me filter out those 30 lines.
I also tried several sed and awk statements I saw for similar issues others were having, but none of them seem to help.
I am more of a network admin getting my feet wet on Linux systems.
Can somebody please help?
Answer
This might work for you (GNU sed):
sed '/ERROR java-class - Exception/{:a;$!N;/\n\s*at\s.*/s///;ta;D}' file >new_file
This gathers up all the lines following ERROR java-class - Exception
that begin with spaces followed by at ...
into one line and then deletes that line. Using the above as a template, other exceptions can be filtered in the same manner.
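
Since the question also mentions trying awk, here is a minimal awk sketch of the same idea. It assumes each exception block starts with a line containing ERROR java-class - Exception (taken from the sample above) and that every continuation line begins with optional whitespace followed by at; adjust both patterns to match your real log.

awk '
    # Drop the exception header line and note that we are inside a trace
    /ERROR java-class - Exception/ { skip = 1; next }
    # While inside a trace, drop continuation lines that start with "at"
    skip && /^[[:space:]]*at / { next }
    # Any other line ends the trace and is printed normally
    { skip = 0; print }
' foo.log > new_file.log

This drops the header line together with its stack-trace continuation lines while leaving all other log lines intact; if your traces contain other continuation lines, an extra pattern would be needed for those.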