Date: Mon, 7 Sep 2009 05:16:02 -0600
From: Modulok <modulok@gmail.com>
To: "parv@pair.com" <parv@pair.com>
Cc: freebsd-questions@freebsd.org
Subject: Re: Is there such thing as a 'soft checksum' tool?
Message-ID: <64c038660909070416y285e0fddv42a43a7c127d4a74@mail.gmail.com>
In-Reply-To: <20090907083112.GA2787@holstein.holy.cow>
References: <64c038660909050933h25a91edcw56688993f5557ad2@mail.gmail.com> <44skf0c6zq.fsf@lowell-desk.lan> <20090907083112.GA2787@holstein.holy.cow>
>> Modulok <modulok@gmail.com> writes:
>> > I'm not even sure such a tool exists, but it's worth asking:
>> >
>> > I'm looking for a pseudo-checksum tool for use with cataloging
>> > images. For example, a strict checksum algorithm, like the SHA
>> > family, will produce a dramatically different checksum for two
>> > files which differ by only a single bit. I'm looking for
>> > something where two images which are similar get a
>> > proportionally similar checksum. When I speak of similarities
>> > I'm referring to their image patterns, i.e. two images of
>> > differing sizes, which are otherwise identical, would produce
>> > very similar checksums. So the closer the checksums are, the
>> > more similar two given images are.
>> >
>> > Does anyone know of anything like this?
>
> See if this ...
>
> http://www.stonehenge.com/merlyn/LinuxMag/col50.html
>
> ... fits.
>
> - parv

*laughs* It makes me feel pretty good after reading how Mr. Schwartz
went about it. Before I got any replies, I started to think about how
I'd do it and began to sketch out an algorithm on the kitchen floor.
(Largest blackboard in my house.) The general approach was pretty much
the same: recursive bucketing of pixels, generating averaged values
down to a user-defined finite limit.

Thanks to all who have replied thus far!
-Modulok-
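
For the archives, a minimal sketch of the recursive-bucketing idea
described above, assuming Pillow (PIL) is available. The function
names, the depth parameter, and the distance metric here are
illustrative choices, not anything from the thread or from
Mr. Schwartz's column:

    from PIL import Image

    def quad_signature(img, depth=3):
        """Recursively split an image into quadrants, recording the
        average grey level of every bucket down to a fixed depth."""
        grey = img.convert("L")          # collapse to 8-bit greyscale
        width, height = grey.size
        sig = []

        def bucket(box, level):
            pixels = list(grey.crop(box).getdata())
            if not pixels:               # zero-area box on tiny images
                return
            sig.append(sum(pixels) / len(pixels))  # averaged value
            if level < depth:            # user-defined finite limit
                left, top, right, bottom = box
                midx = (left + right) // 2
                midy = (top + bottom) // 2
                bucket((left, top, midx, midy), level + 1)
                bucket((midx, top, right, midy), level + 1)
                bucket((left, midy, midx, bottom), level + 1)
                bucket((midx, midy, right, bottom), level + 1)

        bucket((0, 0, width, height), 0)
        return sig

    def signature_distance(a, b):
        """Mean absolute difference between two signatures built with
        the same depth; smaller means the images are more alike."""
        return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

    # Hypothetical usage with two image files:
    sig_a = quad_signature(Image.open("a.jpg"))
    sig_b = quad_signature(Image.open("b.jpg"))
    print(signature_distance(sig_a, sig_b))  # near 0 for similar images

Because each bucket is defined relative to its own image's
dimensions, two images that differ only in size produce nearly
identical signatures, which is the property asked for upthread.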