Subject: archiver for a rolling history of a file with sparse changes
Posted by: Phil Carmody (pc+usen…@asdf.org)
Date: Thu, 15 Aug 2019
I'm generating a couple of megs of (html) data per day and the data
really doesn't change that much from day to day. Is there an archiver
which will store the complete history of a file, taking advantage of
the knowledge of the file's previous contents?
I was hoping ZPAQ would do the job, as it's designed to archive files'
full histories, but I'm now convinced it doesn't exploit this knowledge.
e.g. one 1.2MB file has a day-to-day diff of ~200-600KB, of which about
half is removed material, effectively noise, leaving only ~100-300KB of
genuinely new data. Yet each subsequent day's version I've added to the
ZPAQ archive has grown it by almost exactly as much as the first day's
did. I'd expect deltas that are 1/10-1/4 the size of the file to compress
to roughly 1/10-1/4 of its compressed size, as they're effectively the
same type of data.
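To sanity-check that expectation, here's a minimal Python sketch (stdlib only, not ZPAQ) that fakes a day's HTML-ish file, changes a sparse fraction of its lines, and compares the compressed size of a unified diff against the compressed size of the full new file. The file contents, change rate, and sizes are all made up for illustration.

```python
import difflib
import random
import zlib

random.seed(0)

# Simulate "day 1": 5000 table rows with incompressible random numbers.
day1 = ["<tr><td>row %d</td><td>%d</td></tr>" % (i, random.randrange(10**6))
        for i in range(5000)]

# Simulate "day 2": the same file with ~5% of lines modified.
day2 = list(day1)
for i in random.sample(range(5000), 250):
    day2[i] = day2[i].replace("</td><td>", "</td><td>upd ")

# A delta storer keeps only the diff; a naive one re-stores the whole file.
full = "\n".join(day2).encode()
delta = "\n".join(difflib.unified_diff(day1, day2, n=1, lineterm="")).encode()

full_z = len(zlib.compress(full, 9))
delta_z = len(zlib.compress(delta, 9))
print("compressed full:", full_z, "compressed delta:", delta_z)
```

The compressed delta scales with the amount of change rather than with the file size, which is the behaviour the ZPAQ archive above isn't showing.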
Any ideas what would be a suitable program to use?
FOSS on Linux preferred; happy to compile from source.
We are no longer hunters and nomads. No longer awed and frightened, as we have
gained some understanding of the world in which we live. As such, we can cast
aside childish remnants from the dawn of our civilization.
-- NotSanguine on SoylentNews, after Eugen Weber in /The Western Tradition/