
Author Topic: Looking for a compression (algorithm) "Best Practices" sheet  (Read 5623 times)

tinjaw

  • Supporting Member
  • Joined in 2006
  • Posts: 1,927
Looking for a compression (algorithm) "Best Practices" sheet
« on: February 05, 2008, 10:33 AM »
I understand enough about compression to know that there isn't one compression algorithm to rule them all. However, I don't know enough to know which ones are best for various tasks. For example, it was only elsewhere on these forums that I learned a best practice for when only WinZip is available. What I would like to have is a list of these Best Practices. For example, what if I want to archive a whole directory of files of mixed types, some of which may already be compressed to some degree? I don't care how long it takes but I do want it to be absolutely as small as I can get it.
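
For concreteness, here's roughly the kind of rule I'm after, sketched in Python with the stdlib zipfile module (my own illustration, not something from an existing sheet; the extension list is just a guess at commonly pre-compressed formats). The idea: store files that already look compressed instead of wasting time deflating them again.

Code:
import zipfile
from pathlib import Path

# Rough guess at formats that are usually already compressed;
# deflating these again costs time for little or no gain.
ALREADY_COMPRESSED = {".zip", ".rar", ".7z", ".gz", ".bz2",
                      ".jpg", ".jpeg", ".png", ".mp3", ".mp4"}

def archive_dir(src_dir, archive_path):
    """Deflate ordinary files; merely store ones that look pre-compressed."""
    with zipfile.ZipFile(archive_path, "w") as zf:
        for path in sorted(Path(src_dir).rglob("*")):
            if not path.is_file():
                continue
            method = (zipfile.ZIP_STORED
                      if path.suffix.lower() in ALREADY_COMPRESSED
                      else zipfile.ZIP_DEFLATED)
            zf.write(path, path.relative_to(src_dir), compress_type=method)

archive_dir("mixed_files", "mixed_files.zip")  # hypothetical directory name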

Does anybody know of such a document?

nontroppo

  • Charter Honorary Member
  • Joined in 2005
  • Posts: 649
  • spinning top
Re: Looking for a compression (algorithm) "Best Practices" sheet
« Reply #1 on: February 05, 2008, 10:40 AM »
I don't know of a document per se, but this site tests various compression apps against single files and sets of multiple documents. From its results I think you can work out which apps are best for which tasks:

http://www.maximumco...ssion.com/index.html
FARR Wishes: Performance Tweaks · Task Control · Adaptive History

Renegade

  • Charter Member
  • Joined in 2005
  • Posts: 13,291
  • Tell me something you don't know...
Re: Looking for a compression (algorithm) "Best Practices" sheet
« Reply #2 on: February 05, 2008, 05:27 PM »
Maximum Compression is an excellent site. However, its focus is literally that -- maximum compression, not efficiency. There's a huge difference. Search for "Calgary Corpus" -- it's a good compression benchmark. There is another one that I'm forgetting at the moment too.

If you're looking for the smallest size, check out the PAQ8i or similar PAQ compression tools. However, keep in mind that PAQ compression at maximum is SLOW! The gains generally aren't worth it.

A few good compression algorithms with high efficiency: RAR, PPMd, ACE, BZip2. They vary, but all strike a good balance between compression ratio and speed. (They don't beat PAQ for maximum compression, though.)
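
To make the ratio-versus-speed tradeoff concrete, here's a small benchmark sketch in Python (my own addition; it sticks to stdlib codecs -- zlib for DEFLATE, bz2 for BZip2, and lzma -- since PAQ, PPMd, RAR and ACE have no stdlib bindings). Run it on a large file and watch size drop as time climbs:

Code:
import bz2
import lzma
import time
import zlib

def benchmark(data):
    """Print compressed size and wall time for a few stdlib codecs."""
    codecs = {
        "zlib (DEFLATE, level 9)": lambda d: zlib.compress(d, 9),
        "bz2 (level 9)":           lambda d: bz2.compress(d, 9),
        "lzma (preset 9)":         lambda d: lzma.compress(d, preset=9),
    }
    for name, compress in codecs.items():
        start = time.perf_counter()
        size = len(compress(data))
        elapsed = time.perf_counter() - start
        print(f"{name:26s} {size:>12,d} bytes  {elapsed:7.2f} s")

with open("testfile.bin", "rb") as f:  # hypothetical input file
    benchmark(f.read())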

No -- I'm not touting ALZ compression here, as it's geared towards massive archives rather than high compression ratios. It uses DEFLATE/ZIP compression, which, while good, isn't the best for high ratios.

Quote from: tinjaw
I don't care how long it takes but I do want it to be absolutely as small as I can get it.

Try PAQ on maximum compression then try to say that again. ;)
Slow Down Music - Where I commit thought crimes...

Freedom is the right to be wrong, not the right to do wrong. - John Diefenbaker

Renegade

Re: Looking for a compression (algorithm) "Best Practices" sheet
« Reply #3 on: February 05, 2008, 05:35 PM »
Forgot some links:

http://cs.fit.edu/~m...ney/compression/#paq

Look here for ratios:

http://www.maximumco...ion.com/data/hlp.php

(Note -- I'm not too pleased with how ALZip ranks there, but keep in mind that maximum compression isn't ALZ's purpose. It dates from when archive formats had a 4GB size limit and was designed to overcome that: the format has no theoretical size limit, only the disk space limit.)

f0dder

  • Charter Honorary Member
  • Joined in 2005
  • Posts: 9,153
  • [Well, THAT escalated quickly!]
Re: Looking for a compression (algorithm) "Best Practices" sheet
« Reply #4 on: February 05, 2008, 06:13 PM »
Do you mean best practices as in which apps to use, or more along the lines of coding/scripting stuff?

I guess another piece of "best practice" is that, when doing solid compression, you typically want files with similar content grouped next to each other, to (ab)use what's already in the compression dictionary. RAR does this by sorting files based on extension -- I dunno if it does that globally or per-folder, though.
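
A quick sketch of the general idea in Python (my own illustration, not what RAR actually does internally): build a solid .tar.xz and sort the file list by extension first, so similar content sits next to each other in the single LZMA stream.

Code:
import tarfile
from pathlib import Path

def solid_archive(src_dir, archive_path):
    """Create a solid .tar.xz, grouping files by extension so similar
    content is adjacent in the one compression stream."""
    files = [p for p in Path(src_dir).rglob("*") if p.is_file()]
    files.sort(key=lambda p: (p.suffix.lower(), p.name))
    with tarfile.open(archive_path, "w:xz") as tf:
        for path in files:
            tf.add(path, arcname=path.relative_to(src_dir))

solid_archive("mixed_files", "mixed_files.tar.xz")  # hypothetical names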
- carpe noctem