Hey guys,
I'm looking for a toolset to create and manage file backups on Linux, built with the Unix workflow in mind. The perfect solution, to my mind, would be multiple small tools, each doing one step:
- take a file list of already scanned (and hashed) files plus a directory to scan, and produce a list of the files that need updating (because they changed or aren't in the file list from previous scans; see the sketch after this list)
- process this new file list and perform the updates (for example, copy the files to a different folder)
- optional: wrangle the files and recompress them (explained below)
- take the processed files (the new output) and group them into bundles of a configurable size (for example, to burn them to DVDs/Blu-rays later; see the bin-packing sketch further down)
- update the processed file list with the new content
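To make the first step concrete, here's a minimal sketch of what I mean by the scan/diff tool. The plain-text "<sha256>  <path>" state-file format is just illustrative, not from any existing tool:

```python
#!/usr/bin/env python3
"""Minimal sketch of the scan/diff step. Assumes a plain-text state file
with one "<sha256>  <path>" line per already-processed file (an
illustrative format, not from any existing tool)."""

import hashlib
import sys
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Hash in 1 MiB chunks so large files don't have to fit in memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def load_state(state_file: Path) -> dict:
    # Map path -> hash from previous scans; empty on the first run.
    state = {}
    if state_file.exists():
        for line in state_file.read_text().splitlines():
            digest, path = line.split("  ", 1)
            state[path] = digest
    return state

def main(state_file: str, root: str) -> None:
    known = load_state(Path(state_file))
    for path in sorted(Path(root).rglob("*")):
        if not path.is_file():
            continue
        digest = sha256_of(path)
        if known.get(str(path)) != digest:
            # New or changed: print to stdout so the next tool in the
            # pipeline (copy, recompress, group) can consume the list.
            print(f"{digest}  {path}")

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```

The output could be piped straight into the copy/update tool, and the updated state file is just the old one merged with the new lines.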
Recompression: the common file formats in use today (zip, jpg, mp3, pdf) all use suboptimal compression and can be recompressed with up-to-date algorithms for an extra 20-30% compression (in the case of jpg and mp3 with no loss of quality, because only the entropy-coding stage of the compression scheme is reprocessed). Background: these formats use Huffman coding, which is fast but not optimal; the current state of the art is asymmetric numeral systems (ANS) coding, which is about as fast as Huffman but compresses better.
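As a rough illustration of the Huffman-vs-ANS gap (not the bit-exact in-place repacking I described for jpg/mp3): zstd uses FSE, a tabled ANS variant, in its entropy stage, so re-encoding a zip member's payload with it shows the kind of size difference I mean. This assumes the third-party `zstandard` package, and it repacks into a different container rather than rewriting the zip losslessly in place:

```python
#!/usr/bin/env python3
"""Rough illustration of recompressing zip members with an ANS-based
coder. Requires the third-party `zstandard` package (pip install
zstandard). This only compares sizes; a real tool would also have to
keep enough metadata to restore the original archive bit for bit."""

import sys
import zipfile
import zstandard as zstd

def compare_recompression(path: str) -> None:
    cctx = zstd.ZstdCompressor(level=19)
    with zipfile.ZipFile(path) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue
            raw = zf.read(info)            # deflate -> raw bytes
            repacked = cctx.compress(raw)  # raw bytes -> zstd (FSE/ANS)
            print(f"{info.filename}: {info.compress_size} -> "
                  f"{len(repacked)} bytes")

if __name__ == "__main__":
    compare_recompression(sys.argv[1])
```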
Most Linux backup tools I've found on the net just copy files over the network, but can't build bundles to burn to DVD.
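The grouping step itself is just bin packing; a first-fit-decreasing sketch like the one below would be enough for my use case (the 25 GB single-layer Blu-ray capacity is illustrative):

```python
#!/usr/bin/env python3
"""First-fit-decreasing bin packing of files into disc-sized bundles.
The capacity constant is illustrative (nominal single-layer BD-R)."""

from pathlib import Path

BLURAY_BYTES = 25_000_000_000

def group_for_discs(paths, capacity=BLURAY_BYTES):
    discs = []  # list of (free bytes, list of files)
    # Place the largest files first so the bins fragment less; files
    # bigger than the capacity would need splitting (not handled here).
    for path in sorted(paths, key=lambda p: p.stat().st_size, reverse=True):
        size = path.stat().st_size
        for i, (free, files) in enumerate(discs):
            if size <= free:
                files.append(path)
                discs[i] = (free - size, files)
                break
        else:
            discs.append((capacity - size, [path]))
    return [files for _, files in discs]

if __name__ == "__main__":
    import sys
    bundles = group_for_discs(
        [p for p in Path(sys.argv[1]).rglob("*") if p.is_file()])
    for n, bundle in enumerate(bundles):
        for path in bundle:
            print(f"disc{n:03d}\t{path}")
```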
Any hints about existing solutions?
PS: I've already started my own implementation of this concept, but I don't have the time to really maintain it, and it doesn't have all the features I'd hope for.