From: Robin Becker <robin@reportlab.com>
Date: Sat, 23 Feb 2008 10:32:41 +0000
To: freebsd-questions@FreeBSD.org
Subject: duplicate message removal

We have a bunch of FreeBSD 6.x servers which we administer remotely. As part of that, we get the normal root job mails emailed to a mailing list which the admins (mostly me) can inspect at leisure and also use for historical purposes.

Trouble is, many of the emails get huge because of repeated messages, typically stuff like xxx.yyy.com login failures:

Feb 22 20:07:54 app3 sshd[56886]: reverse mapping checking getaddrinfo for 216-194-26-66.ny.ny.metconnect.net failed - POSSIBLE BREAKIN ATTEMPT!

and so on. All these servers are running denyhosts, but we still see lots of these messages.
I was wondering if there's any simple compression script which notices the repetitions (apart from the timestamp) and can remove the many duplicates.
--
Robin Becker
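One possible approach is a small filter that strips the syslog timestamp (and the varying PID in brackets) from each line before comparing, then prints each unique message once with a repeat count. The sketch below is not an existing tool, just a minimal illustration; the regexes assume the standard "Mon DD HH:MM:SS host prog[pid]: msg" syslog format and would need adjusting for other layouts.

```python
#!/usr/bin/env python
# Sketch: collapse repeated syslog lines, ignoring the leading
# timestamp and any "[pid]" so that otherwise-identical messages
# count as duplicates. Reads the mail body on stdin and writes
# each unique message once, prefixed with its repeat count.
import re
import sys
from collections import OrderedDict

# Matches the leading "Feb 22 20:07:54 " syslog timestamp.
TIMESTAMP = re.compile(r'^[A-Z][a-z]{2} [ \d]\d \d\d:\d\d:\d\d ')
# Matches a bracketed PID such as "[56886]".
PID = re.compile(r'\[\d+\]')

def dedupe(lines):
    """Return an ordered mapping of normalized line -> occurrence count."""
    counts = OrderedDict()
    for line in lines:
        key = PID.sub('[]', TIMESTAMP.sub('', line.rstrip('\n')))
        counts[key] = counts.get(key, 0) + 1
    return counts

if __name__ == '__main__':
    for msg, n in dedupe(sys.stdin).items():
        prefix = '%5dx ' % n if n > 1 else ' ' * 7
        sys.stdout.write(prefix + msg + '\n')
```

Piping a periodic-mail body through this (e.g. as a procmail or Mailman filter stage) would shrink the repeated reverse-mapping warnings to one line each plus a count; a coarser variant of the same idea is sed to strip the timestamp followed by sort | uniq -c, at the cost of losing the original line order.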