From owner-freebsd-arch  Wed Jul 26 01:33:14 2000
Date: Wed, 26 Jul 2000 01:33:12 -0700 (PDT)
From: Kris Kennaway <kris@FreeBSD.org>
To: "Jeroen C. van Gelderen", markm@freebsd.org
Cc: arch@freebsd.org
Subject: Re: Estimating entropy

On Tue, 25 Jul 2000, Kris Kennaway wrote:

> 2) Keep a frequency table and calculate or estimate the Shannon entropy
> periodically. This may be feasible if we treat the samples as 8-bit
> sources, as you only have to loop over 256 values and calculate the
> log_2 of the probabilities (although the lack of FP in the kernel would
> complicate this).

I was thinking about this on the way home, and we can make a big
optimisation here if we assume that the measurements have a Gaussian
distribution (which is probably fairly reasonable for most sources, at
least to a first approximation). In that case we only need to know the
mean and variance of the last n samples (e.g. stored in a circular
buffer), both of which can be computed incrementally without a full pass
over all n samples each time, and the entropy then has a simple
closed-form solution.

Kris

--
In God we Trust -- all others must submit an X.509 certificate.
    -- Charles Forsythe
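
[Editor's sketch, not code from this thread or from FreeBSD: the scheme
above is easy to illustrate in C. The closed form presumably meant is the
differential entropy of a Gaussian with variance sigma^2, which is
0.5 * log_2(2 * pi * e * sigma^2) bits. All names and the WINDOW size
below are invented for the example, and it uses floating point, which
the quoted message notes is unavailable in the kernel; an in-kernel
version would need fixed-point arithmetic or a log table.]

/*
 * Sketch: incremental mean/variance over a sliding window of 8-bit
 * samples, plus the closed-form differential entropy of a Gaussian.
 * Illustrative only; not from the thread or any FreeBSD source.
 */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#define WINDOW 256                      /* n: number of samples kept */

struct entropy_est {
        uint8_t buf[WINDOW];            /* circular buffer of samples */
        int     pos;                    /* next slot to overwrite */
        int     count;                  /* samples held, <= WINDOW */
        double  sum;                    /* running sum of samples */
        double  sumsq;                  /* running sum of squares */
};

/* Add one sample, updating the running sums in O(1). */
static void
est_add(struct entropy_est *e, uint8_t x)
{
        if (e->count == WINDOW) {
                uint8_t old = e->buf[e->pos];   /* oldest sample */
                e->sum   -= old;                /* retire it */
                e->sumsq -= (double)old * old;
        } else
                e->count++;
        e->buf[e->pos] = x;
        e->pos = (e->pos + 1) % WINDOW;
        e->sum   += x;
        e->sumsq += (double)x * x;
}

/*
 * Entropy estimate in bits/sample under the Gaussian assumption:
 * h = 0.5 * log_2(2 * pi * e * sigma^2).
 */
static double
est_entropy_bits(const struct entropy_est *e)
{
        double n, mean, var;

        if (e->count < 2)
                return (0.0);
        n = e->count;
        mean = e->sum / n;
        var = e->sumsq / n - mean * mean;
        if (var <= 0.0)
                return (0.0);           /* constant input: no entropy */
        return (0.5 * log2(2.0 * M_PI * M_E * var));
}

int
main(void)
{
        struct entropy_est e = { 0 };
        int i;

        /* Feed a deterministic ramp as stand-in measurements. */
        for (i = 0; i < 1000; i++)
                est_add(&e, (uint8_t)(i * 37));
        printf("estimated entropy: %.2f bits/sample\n",
            est_entropy_bits(&e));
        return (0);
}

[Two caveats with this sketch: differential entropy is only an
approximation for discrete 8-bit samples and can exceed 8 bits (or go
negative for tiny variances), so a real estimator would clamp the result
to the range 0..8; and the E[x^2] - E[x]^2 variance formula can lose
precision when the mean is large relative to the spread, where Welford's
online algorithm is the numerically stable alternative.]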